This is a quick question for anyone who might be familiar with hybrid dynamical systems.
We can represent a vector field v on a manifold X as a \emph{derivation} of the ring of scalars on X, i.e. a linear map D_v \colon C^\infty(X) \to C^\infty(X) that satisfies the Leibniz law
D_v(fg) = D_v(f) g + f D_v(g)
D_v(f) can be interpreted as the “derivative of f along v” (and of course, that is how it is usually defined).
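For concreteness, in local coordinates x^1, \dots, x^n with v = \sum_i v^i \, \partial/\partial x^i, this is just the directional derivative

D_v(f) = \sum_i v^i \frac{\partial f}{\partial x^i}.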
Could we generalize this to a map D_v \colon C^\infty(X) \to \mathcal{D}'(X), where \mathcal{D}'(X) is the space of distributions on X? Then we could describe dynamical systems like a bouncing ball, where the momentum of the ball changes discontinuously, so the derivative of the momentum is, in essence, like a delta-distribution. A trajectory \gamma \colon I \to X would then have to satisfy something like an integral equation, holding for all f \in C^\infty(X) and all test functions \omega \in C^\infty_c(I).
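To sketch what I have in mind (this is only a guess at the right formulation): in the smooth case a trajectory satisfies \frac{d}{dt} f(\gamma(t)) = (D_v f)(\gamma(t)), and integrating against \omega by parts turns this into

-\int_I f(\gamma(t)) \, \omega'(t) \, dt = \int_I (D_v f)(\gamma(t)) \, \omega(t) \, dt.

The hope is that the right-hand side can still be given a meaning when D_v f is only a distribution on X, perhaps by pairing D_v f against some pushforward of \omega \, dt along \gamma, though making that precise is exactly the problem.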
I’m not a big expert in distributions or hybrid dynamical systems, so I’m not sure whether or not this is done. I tried skimming a couple textbooks on distributions, but I couldn’t find anything quite like this in them.
There is something called a weak solution (Weak solution - Wikipedia) in differential equations, which is very similar, but it is usually a matter of asking for a distribution which solves a normal differential equation (although what this means is, in effect, an integral equation of the type you’re talking about).
I know a bit about distributions and I’ve never seen this particular definition. I’ve seen something similar, which however doesn’t seem to be exactly what you want.
There is a well-developed theory of currents: the space of p-currents is defined as the topological dual of the topological vector space of compactly supported smooth p-forms.
A smooth vector field on, say, a Riemannian manifold defines a 1-current: to pair a smooth vector field and a compactly supported smooth 1-form, pair them pointwise to get a compactly supported function, then integrate that to get a number. I’m using the Riemannian metric only so we can integrate compactly supported functions and get numbers. I don’t need the full force of the Riemannian metric: only the ‘Lebesgue measure’ that it gives. (Any Riemannian manifold has a ‘Lebesgue measure’.)
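Explicitly, the pairing I have in mind is

\langle v, \alpha \rangle = \int_X \alpha(v) \, d\mathrm{vol}_g

where \alpha is a compactly supported smooth 1-form, \alpha(v) is the compactly supported smooth function you get by pairing pointwise, and d\mathrm{vol}_g is the Riemannian volume measure (the ‘Lebesgue measure’ I just mentioned).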
More generally, on a Riemannian manifold I’d be willing to guess that your sort of distributional vector field (= derivation from smooth functions to the topological dual of the space of compactly supported smooth functions) gives a 1-current and vice versa.
However, the need to choose a Riemannian structure (or more generally a nice enough sort of measure) to turn distributional vector fields into 1-currents is ugly. They are distinct ideas on a general manifold.
So, what I’d want to do is show the space of your distributional vector fields is the topological dual of some topological vector space. Not the space of compactly supported 1-forms. Instead, a suitable space of compactly supported 1-form densities.
This probably sounds complicated but I swear it’s a reasonable idea. The thing is, people tend to learn a lot about tensors but not enough about tensor densities. We often need them to solve problems exactly like what I’m talking about here: your distributional vector fields are sort of like elements of the dual of the space of compactly supported smooth 1-forms, but not quite.
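To sketch the idea: a 1-form density is a 1-form tensored with a density, i.e. a section of T^*X \otimes |\Lambda^n T^*X|. Pairing a vector field v pointwise with a compactly supported 1-form density \alpha \otimes \mu gives a compactly supported density \alpha(v) \, \mu, and densities are exactly the things you can integrate on a bare manifold with no metric or chosen measure:

\langle v, \alpha \otimes \mu \rangle = \int_X \alpha(v) \, \mu.

So the topological dual of the space of compactly supported smooth 1-form densities is my candidate home for your distributional vector fields.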
I could say more but it’s better to see if you care at all about this direction of thought. It doesn’t help you directly with the question of whether you can use your distributional vector fields to formulate - and, much harder, solve! - generalized first-order ODE.
One other thing about your idea. You seem to be defining distributions as the topological dual of the space of compactly supported smooth functions, and then defining distributional vector fields as derivations from the algebra of smooth functions to the module of distributions. So you’re basing your definition on two algebras: the algebra of smooth functions and the algebra of compactly supported smooth functions. I think your definition would be formally cleaner if you used just one.
There are two ways to change your definition so that it uses just one of these algebras. Either of these would make it easier to prove simple algebraic facts about distributional vector fields, since you’d be messing around with just one algebra instead of two. But you’d have to look at both candidates and see if they do what you want. (On a compact manifold both candidates are the same, so this is a fairly subtle issue and maybe you shouldn’t worry about it yet.)
If I remember correctly, the dual of the space of smooth functions is the space of compactly supported distributions.
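Spelled out, the two candidates I have in mind would be something like this: derivations from C^\infty_c(X) to its own dual \mathcal{D}'(X), or derivations from C^\infty(X) to its own dual \mathcal{E}'(X), the compactly supported distributions. On a compact manifold C^\infty_c(X) = C^\infty(X), so the two candidates agree there.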
On reflection, all my comments are just a way of saying “this is a cool idea, and I think it can be polished up to make it nicer”. You’ve got a concept of distributional vector field; we can try to tweak it a bit and think about it in different useful ways. But the hard part is trying to prove any theorem about existence of solutions of
\dot{x}(t) = v(x(t))
where v is a distributional vector field. For this I’d start by working in \mathbb{R}^n and forget the nonsense about manifolds. Then a vector field can be thought of as an n-tuple of functions and a distributional vector field had better amount to the same thing as an n-tuple of distributions. People must have thought about this.
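To spell out the dictionary I’d expect: writing v = \sum_i v^i \, \partial/\partial x^i with each v^i \in \mathcal{D}'(\mathbb{R}^n), the derivation should be

D_v(f) = \sum_i v^i \, \frac{\partial f}{\partial x^i}

which makes sense because a distribution can always be multiplied by a smooth function. So that part should be unproblematic; the real trouble is plugging such a v into the differential equation.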
In fact I’d start with n = 1 and see what people have to say about that case! I’d ask about that case on MathOverflow.
The idea of 1-currents seems like it intuitively captures something similar to this, but I wonder if anyone has used 1-currents to handle elastic collisions (that’s the main reason I’m interested in this).
Now that I think of it… perhaps the way to do elastic collisions is via a Lagrangian approach, similar to piecewise differentiable shortest path problems that I’ve seen for the path of a light beam. If so, it would be neat to generalize the Lagrangian → Hamiltonian transition to use these kinds of ODEs… I.e., if we poke holes in the space of many-body configurations that correspond to the disallowed overlapping states, then look for paths which minimize the action but do not go through those holes, then maybe we can turn it into an ODE where you get a delta when you hit the holes… But I’m just thinking out loud here; I’m not sure if this makes sense.
The only treatment of elastic collisions I’ve seen has been purely from a numerical perspective, i.e. in video game engines; it seems strange to me that I haven’t seen much on doing it geometrically/analytically.
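For comparison, here’s roughly the kind of numerical treatment I mean, as a minimal sketch (my own toy code, not taken from any particular engine, with the restitution coefficient and time step picked arbitrarily): between impacts the state follows a smooth ODE, and at an impact the velocity is simply reset, which is exactly the discontinuity I’d like to model as a delta in the derivative of the momentum.

```python
# Minimal event-driven bouncing-ball sketch (illustrative only).
# Between impacts the state (height y, velocity v) follows a smooth ODE;
# at an impact the velocity is reset discontinuously: that reset is the
# "delta" in dv/dt.

g = 9.81    # gravitational acceleration
e = 0.9     # coefficient of restitution (arbitrary choice)
dt = 1e-3   # time step (arbitrary choice)

y, v, t = 1.0, 0.0, 0.0   # drop the ball from height 1 at rest
trajectory = []

while t < 5.0:
    # smooth flow: explicit Euler step for y' = v, v' = -g
    y_next = y + dt * v
    v_next = v - dt * g

    # guard: the ball would pass through the floor during this step
    if y_next < 0.0:
        # jump map: put the ball on the floor and reflect its velocity
        y_next = 0.0
        v_next = -e * v_next

    y, v, t = y_next, v_next, t + dt
    trajectory.append((t, y, v))

print(trajectory[-1])   # final (time, height, velocity)
```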
People on MathOverflow prize very specific questions, and I don’t think you managed to capture what you’re really interested in with this question. The score on your question has dropped from 1 to 0 since I last checked, and the one answer amounts to saying “well, the differential equation you wrote down is really easy - what’s the big deal?” You should generally assume people will take you literally on MathOverflow and try to shoot down your question as efficiently as possible, interpreting it uncharitably. Remember, they’re mathematicians.
I would have asked something like
"Under what conditions on a distribution v on the real line have people proved existence and/or uniqueness for the differential equation
\dot{x}(t) = v(x(t)) ?
We will in general need x to be a weak solution of some sort, unless v is actually a Lipschitz function, which allows us to apply the usual Picard-Lindelöf theorem. I am mainly interested in the case of distributions that are not functions."
Or something like that. I might include an example that’s sort of interesting.
That looks like the answer to a different, and much easier, question. They claim your question is “basically about the existence of distributional antiderivatives”, i.e. given a distribution S on the real line is there a distribution T with
T' = S
If that’s really what you’re wondering about then you’re in luck. The answer is “yeah, sure, it’s in every textbook - here’s how it works.”
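For the record, the standard fact is that every distribution S \in \mathcal{D}'(\mathbb{R}) has an antiderivative: there is a T \in \mathcal{D}'(\mathbb{R}) with T' = S, unique up to adding a constant. In other words it handles \dot{x}(t) = S(t), with t rather than x(t) fed into S.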
I thought you were interested in the question: given a distribution S on the real line is there some sense in which we can find a function x : \mathbb{R} \to \mathbb{R}, or maybe a distribution, that solves the differential equation
\dot{x}(t) = S(x(t)) \; ?
You can’t answer this just by taking the antiderivative of S. When S is a function the textbook approach is to separate variables:
\displaystyle{ \frac{dx}{S(x)} = d t}
then integrate:
\displaystyle{ t + C = \int \frac{dx}{S(x)} }
and then do the integral, and solve for x as a function of t. All of this is problematic for distributions. There are better ways to deal with 1st-order ODE but again it’s not obvious that they work in this context. You mentioned one nice way in your blog article - convert it to an integral equation! But you didn’t mention that on MathOverflow. So the people answering took the path of least resistance.
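Going back to the separation-of-variables recipe for a moment, just to make it concrete with an example of my own: taking S(x) = x gives

t + C = \int \frac{dx}{x} = \ln|x|

so x(t) = A e^t. It’s this whole chain of manipulations, not just the final integration, that stops making sense when S is, say, a delta function.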
This is why I like working with manifolds: when everything is on the real line it’s harder to spot type errors! OK, I’ll re-edit my question to clarify.
They’re not going to like it if you completely change the question after someone wrote a long, correct answer to your original question. Ask a new, different question! Otherwise you’ll learn the ethos of MathOverflow the hard way.