


Page history last edited by PBworks 16 years, 10 months ago

Adam's talk


New Foundations


An intro story to provide motivation for this approach:


I've done some work recently on some cutting-edge theory in machine learning in the context of a class project. The professor's work was very interesting, but it was comprehensible only to the most dedicated of us in the class. The work was heavily dependent on advanced mathematics, statistics, and even some recent results from quantum physics -- not to mention the previous work in machine learning that it sat on top of.


Unlike most other work, however, the results were suspiciously universal. They suggested that the things we were working with in this context, sitting high atop an immense pile of work from different disciplines, were somehow more fundamental than any of the parts we built them out of.


This experience confirmed a growing suspicion of mine that almost all of the really advanced knowledge we have doesn't need to be as complex as it seems, that much of its complexity is an artifact of the particular order in which the things it is defined in terms of were developed.


So, if we were going to knock down all of our knowledge and rebuild it in an intelligently organized and efficient-to-learn manner, such that it had at least as much, if not more, expressive power than the old organization, what would we put at or near the bottom? What do people really have to "just accept" on the promise that it will make understanding the rest so much easier? What should we commit to being "actually real" to justify the claims we set atop it?


A definition


"foundations" => systems of thought that are defined in terms of a minimal context that can serve as a basis for the justification of other systems of thought and provide the context for bodies of knowledge


ideally the foundation is just a few core definitions (axioms) and a central story about what the things defined mean in the non-abstract universe (intuition)


ideally they are easy to learn


ideally they are able to directly host a large number of interesting and diverse bodies of knowledge that have central stories that cleanly mesh with the foundational story


Let me tell you about two foundation-candidates I'd like to pull down from the ivory tower and set on the floor.


The first foundation I propose is a kind of mathematics called "geometric algebra".


Technically, the universal geometric algebra is "an infinite dimensional real algebra with a geometric interpretation", but it is much easier to learn it simply as "geometry done right, done right for every place you could ever want to apply it".


This approach is technically older than the far more popular linear algebra that serves as a foundation for so much engineering today. However, general-purpose texts on the subject weren't available until, oh, somewhere between just a few years ago and not yet, depending on your background.


Apart from a precise axiomatic definition (which is quite simple actually), the premise is that there are things called vectors and you can combine them with a geometric sum and geometric product (producing a rich ecosystem of other types of geometric objects including the familiar real numbers). The central story is that vectors are directed one dimensional spaces, lines that go a particular direction, and you make more interesting spaces by combining them.
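That central story can be sketched concretely. Below is a minimal, illustrative implementation of the 2D case -- none of these names come from a standard library -- assuming an orthonormal basis {e1, e2} where each basis vector squares to 1:

```python
# A minimal sketch of 2D geometric algebra (assumed setup, not from
# any library).  A multivector is stored as coefficients on the four
# basis blades (1, e1, e2, e12), where e12 = e1*e2 is a bivector.

class MV:
    def __init__(self, s=0.0, e1=0.0, e2=0.0, e12=0.0):
        self.c = (s, e1, e2, e12)

    def __add__(self, o):                   # geometric sum: componentwise
        return MV(*(a + b for a, b in zip(self.c, o.c)))

    def __mul__(self, o):                   # geometric product
        a0, a1, a2, a3 = self.c
        b0, b1, b2, b3 = o.c
        return MV(
            a0*b0 + a1*b1 + a2*b2 - a3*b3,  # scalar part (e12*e12 = -1)
            a0*b1 + a1*b0 - a2*b3 + a3*b2,  # e1 part
            a0*b2 + a2*b0 + a1*b3 - a3*b1,  # e2 part
            a0*b3 + a3*b0 + a1*b2 - a2*b1,  # e12 (bivector) part
        )

e1 = MV(e1=1.0)
e2 = MV(e2=1.0)
print((e1 * e1).c)   # a vector times itself is a scalar: (1.0, 0.0, 0.0, 0.0)
print((e1 * e2).c)   # orthogonal vectors make a bivector: (0.0, 0.0, 0.0, 1.0)
```

Note how the "rich ecosystem of other types" appears on its own: multiplying two vectors already produces scalars and bivectors, with no extra machinery.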


Because of the infinite dimensional nature of the definition, there is a subalgebra with precisely the same, simple rules as the general case that can be applied to problems in whatever space they live in (which is presumably some specific corner of the enclosing infinite dimensional space). There "are" no complex numbers, matrices, or other things that might claim to be more fundamental than real vectors. You may certainly construct them in geometric algebra, but the way you construct them will give you a story about what they are in terms of directed spaces (with which the user of this language would be deeply familiar).
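As a small illustration of that construction claim: in 2D geometric algebra, the even multivectors a + b·e12 (scalar plus bivector) are closed under the geometric product, and e12 squares to -1, so they multiply exactly like complex numbers. A sketch, with illustrative names:

```python
# Sketch: "complex numbers" reconstructed inside 2D geometric algebra.
# An even multivector (a, b) stands for a + b*e12; since e12*e12 = -1,
# the geometric product on this subalgebra is complex multiplication.

def even_mul(x, y):
    """Geometric product of two even multivectors a + b*e12."""
    a, b = x
    c, d = y
    return (a*c - b*d, a*d + b*c)   # the familiar complex product rule

# e12 plays the role of the imaginary unit:
print(even_mul((0, 1), (0, 1)))     # (-1, 0)
# (3 + 4*e12)(1 + 2*e12) matches (3 + 4i)(1 + 2i) = -5 + 10i:
print(even_mul((3, 4), (1, 2)))     # (-5, 10)
```

The "i" here isn't a mysterious imported object; it's the unit bivector, a directed plane, which is exactly the kind of story about directed spaces the text describes.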


Because this approach is universally applicable with respect to the many fields that make use of linear algebra, geometry, and their cousins, computer scientists, engineers, and physicists can share a common, formal language. Physicists are notorious for inventing their own systems of mathematics that are incompatible with terminology from other fields, making it harder to see when efforts are duplicated and wheels reinvented. It is somewhat fitting, then, that geometric algebra grew out of a time when the people who studied physics were not distinguished from those who studied mathematics.


Lofty possibilities for academia aside, the approach is simple and intuitive enough that it can be taught to kids (or so I and others claim). Because there is simply far less complexity, geometric algebra could easily be introduced as early as geometry and elementary algebra are now. It wouldn't be the "kids' version"; it would be the "real thing".


I want to be talking about new foundations, plural, in this talk, so I should move on.


The second foundation I propose is "inferential calculus". Scary title aside, this is no less than the unification of logic and Bayesian probability theory: rational reasoning, under uncertainty or not, fully formalized.


Bayesian approaches, those often characterized by the use of prior (subjective) beliefs in computation, were once rejected on the grounds that good science is purely objective. However, in modern times, physics is telling us that the observer does matter, critically, and so, I claim, do their beliefs. In the inferential calculus, otherwise thought of as the processes by which you may infer things, beliefs are first-class objects.


Now pervasive in many fields, the inferential calculus provides a unified and, again, universal language to describe the process by which the information held by a given agent (its beliefs) is updated in response to outside data. All of our existing knowledge relating to the processing of statistics can be re-seated in the context of a Bayesian approach and automatically gain a story about what is going on in a computation in terms of beliefs (or at least Bayesians claim as much). If demanded, the foundations of this approach can be justified in terms of information theory. However, interestingly, if this approach is taken for granted, it can be used to justify all of information theory. Ok, enough of this tangent.
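The belief-update process described above is just Bayes' rule. A minimal sketch over a discrete hypothesis space (the function name and the coin hypotheses are illustrative assumptions, not from the source):

```python
# A minimal sketch of a Bayesian belief update (assumed setup).
# An agent's beliefs are a probability distribution over hypotheses;
# observing data re-weights them by likelihood and renormalizes.

def update(prior, likelihood):
    """prior: {hypothesis: P(h)}; likelihood: {hypothesis: P(data | h)}."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnormalized.values())          # P(data), the evidence
    return {h: p / z for h, p in unnormalized.items()}

# Two hypotheses about a coin: fair, or biased 90% toward heads.
beliefs = {"fair": 0.5, "biased": 0.5}
# Observe one head: P(H | fair) = 0.5, P(H | biased) = 0.9.
beliefs = update(beliefs, {"fair": 0.5, "biased": 0.9})
print(beliefs)   # belief in "biased" rises to 0.45/0.70, about 0.643
```

The beliefs really are first-class here: the distribution is the object the calculus operates on, and each observation transforms it.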


The question "what is the probability that a coin flipped twice will yield heads twice?" is an opinion question, with more than one right answer, at least without additional details. Students struggle with probability theory. As with geometric algebra, bundling intuition with formal definitions makes it easier to learn. We introduce probability in middle school now, but dare not discuss the fundamentals of rational thought in the same class. With the inferential calculus, these are the same.
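To make the coin question concrete: the answer depends on your prior beliefs about the coin. If you are certain it is fair, P(two heads) = 0.25; under a uniform prior over the coin's heads-bias p, P(two heads) = E[p²] = 1/3. A small sketch of this assumed setup:

```python
# Worked version of the coin question: different (reasonable) beliefs
# give different answers, so the question is underspecified.

# Certain the coin is fair -> the two flips are independent:
p_fair = 0.5 ** 2
print(p_fair)                # 0.25

# Uniform prior over the coin's heads-bias p -> P(two heads) is the
# expected value of p^2, i.e. the integral of p^2 over [0, 1] = 1/3.
# Approximate that integral on a fine grid:
n = 100_000
p_uniform = sum((k / n) ** 2 for k in range(n)) / n
print(round(p_uniform, 3))   # 0.333
```

Both answers are correct given their respective beliefs, which is exactly why a calculus that makes beliefs explicit is needed.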


These are just two ideas that I think deserve to be rescued from their current place, where they are only accessible to those with years and years of advanced training. They are simple, powerful, universal, and intuitive. Best of all, as foundations, they can be taught to the young.


If we can teach less and have students learn more, we are not just being more efficient, we are being more effective.


Now, I pose some open questions:

- What other systems of thought have these properties of good foundations?

- Can we identify any current foundations that can be eliminated and rehosted in terms of others, such that the result is simpler and more powerful?

- What would it take to implement these new foundations?
