Great stuff! There should be loads of things to do with the basic idea.
David Jaz Myers told us of an approach to the nearest neighbor algorithm which sees it as a form of extension problem and then uses enriched profunctors.
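To spell out one way the extension story can go (a sketch in the quantale-enriched setting, not necessarily Myers' own formulation): take V = ([0, \infty], \geq, +, 0), so that V-categories are Lawvere metric spaces. Let A be the discrete V-category on the training points, K: A \to X the inclusion into the data space, and F: A \to V the V-functor sending a point to 0 if it carries a given label and to \infty otherwise. The enriched coend formula for the left Kan extension then reduces to

(\mathrm{Lan}_K F)(x) \;=\; \inf_{a \in A} \big( X(K a, x) + F(a) \big),

which is just the distance from x to the nearest training point carrying that label. Choosing, for each query x, the label minimizing this value is 1-nearest-neighbor classification, so the classifier literally extends the labelled data along K.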
It does seem to be a form of induction.
Regarding abduction, sometimes we have that B \to C already in place, but sometimes there is a range of candidates, B_i \to C, and we're to select one. Sometimes we have a number of arrows f_i: A \to B and we're selecting one. And sometimes abduction is used for the process of coming up with such a B in the first place.
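For what it's worth, the universal object one would compare those candidates against is the right Kan lift (my notation, if I have the variance right, and with the usual caveat that it may not exist): given the observation g: A \to C and the rule h: B \to C, the lift \mathrm{Rift}_h g: A \to B is characterized by a bijection of 2-cells

(f \Rightarrow \mathrm{Rift}_h g) \;\cong\; (h \circ f \Rightarrow g), natural in f: A \to B,

so it is the universal (largest) hypothesis f through which h still accounts for g, and each candidate f_i that does account for g, i.e. comes with a 2-cell h \circ f_i \Rightarrow g, acquires a canonical comparison 2-cell f_i \Rightarrow \mathrm{Rift}_h g.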
If we take things in a Kan extension/lift direction, it may be interesting to reflect on the asymmetry between extensions and lifts (why the former are so much more common). Todd Trimble offers an interesting explanation here.
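Without presuming to reproduce Todd's explanation, one concrete face of the asymmetry in Cat is that extensions come with a pointwise formula while lifts don't: for small A and a suitably cocomplete target,

(\mathrm{Lan}_K F)(b) \;\cong\; \int^{a \in A} B(K a, b) \cdot F(a),

so left Kan extensions along any K exist wholesale, whereas right Kan lifts have no comparable pointwise formula and turn up only in special circumstances, e.g. through a functor h with a right adjoint r, where \mathrm{Rift}_h g \cong r \circ g. In a biclosed bicategory such as V-Prof (for nice enough V), by contrast, right extensions and right lifts of every 1-cell exist, which is presumably part of what makes the profunctor setting so congenial here.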