.

Secret Intelligence Service

.

Artificial Intelligence Primer

The Future State Project

.

.

Iteration as of 2018

Research scientists are looking to the brain for new insight, and some are investigating what may at first appear to be an unlikely starting point: olfaction, the sense of smell. The aim has been to achieve a better understanding of how organisms process chemical information, and this work has uncovered coding strategies that appear especially relevant to problems in Artificial Intelligence. Olfactory circuits bear strong similarities to more complex regions of the brain that have long been of interest in the quest to construct better machines.

Computer scientists are now beginning to probe these findings in machine learning contexts.

State-of-the-art machine learning techniques in operation today are built, at least in part, to mimic the structure of the visual system, which is based on the hierarchical extraction of information. When the visual cortex receives sensory data, it first extracts small, well-defined features: colours, edges and textures, which involve spatial mapping. Neuroscientists discovered during the 1950s and 60s that specific neurons in the visual system correspond to the equivalent of specific pixel locations in the retina.

As visual information is passed along through layers of cortical neurons, details regarding colours, edges and textures assemble to form increasingly abstract representations of the input: that the object is a human face, for example, and that the face belongs to Harriet. Each layer in the network helps the organism achieve that goal.

Deep neural networks were therefore built to work in a similarly hierarchical way, leading to a revolution in machine learning and AI research. To teach these nets to recognise objects such as faces, they are fed thousands of sample images. The system strengthens or weakens the connections between its artificial neurons so that it can more accurately determine that a given collection of pixels forms the more abstract pattern of a face. With sufficient samples, it can recognise faces in new images and in contexts it has not encountered before.
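By way of illustration only, and not as a description of any specific system referenced here, the following toy Python sketch trains a small two-layer network on invented data by repeatedly strengthening or weakening its connections; the data, layer sizes and labels are assumptions chosen purely to keep the example self-contained.

# Toy sketch: a two-layer network trained by adjusting connection strengths.
# All data and labels below are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                  # 200 toy "images", 64 pixels each
y = (X[:, :8].sum(axis=1) > 0).astype(float)    # invented label: "face" vs "not face"

W1 = rng.normal(scale=0.1, size=(64, 32))       # first layer: low-level features
W2 = rng.normal(scale=0.1, size=(32, 1))        # second layer: abstract decision

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for step in range(500):
    h = np.tanh(X @ W1)                         # hidden features
    p = sigmoid(h @ W2).ravel()                 # predicted probability of "face"
    d = (p - y)[:, None] / len(y)               # gradient of the cross-entropy loss
    g2 = h.T @ d                                # how each connection should change
    g1 = X.T @ ((d @ W2.T) * (1 - h**2))
    W2 -= lr * g2                               # connections strengthened or weakened
    W1 -= lr * g1

print("toy training accuracy:", ((p > 0.5) == y).mean())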

There has been great success with these networks, not only in image classification but also in speech recognition, language translation and other machine learning applications. One can think of deep nets as freight trains: very powerful, so long as there is reasonably flat ground on which to lay down tracks and build a huge infrastructure. Biological systems do not require all of that; they can deal with difficult problems that deep nets, at this point in time, cannot.

A current topic in AI is self-driving cars. When a car must navigate a new environment in real time – an environment that is constantly changing and full of noise and ambiguity – deep learning techniques inspired by the visual system may fall short, and methods based loosely on vision may not be the best way to proceed. That vision became such a dominant source of insight at all was partly incidental: it was the system scientists understood best, with clear applications to image-based machine learning tasks.

Algorithms have been developed based on the fly olfactory circuit, in hopes of improving machine learning techniques for similarity searches and novelty detection tasks.

However, different types of stimulus are not processed in the same way. Vision and olfaction are very different types of signals, so there may be different strategies for dealing with different types of data, and there could be much more to learn beyond studying how the visual system works.

It is suspected that the olfactory circuits of insects may well hold some of these lessons. Olfaction research did not take off until the 1990s, when biologists discovered the genes for odour receptors. Since then, however, the olfactory system has become particularly well characterised, and it is something that can be studied easily in flies and other insects. It is tractable in a way that visual systems are not for studying general computational challenges.

Work is done on olfaction because it is a finite system that can be characterised relatively completely.

Humans achieve a great deal with vision; a great deal can be achieved with olfaction, too.

Olfaction differs from vision in many ways. Odours are unstructured: they do not have edges, and they are not objects that can be grouped in space. Odours are mixtures of varying compositions and concentrations, and they are difficult to categorise as similar to, or different from, one another. It is therefore not always clear which features should receive attention.

These odours are analysed by a shallow, three-layer network that is considerably less complex than the visual cortex. Neurons in olfactory areas randomly sample the entire receptor space, not specific regions in a hierarchy; they employ a kind of ‘antimap’. In a mapped system such as the visual cortex, the position of a neuron reveals something about the type of information it carries. In the antimap of the olfactory cortex, this is not the case. Instead, information is distributed throughout the system, and reading that data involves sampling from some minimum number of neurons. An antimap, then, is achieved through what is known as a sparse representation of information in a higher-dimensional space.

Consider the olfactory circuit of the fruit fly: 50 projection neurons receive input from receptors, each of which is sensitive to different molecules. A single odour will excite many different neurons, and each neuron represents a variety of odours. This is a mess of information, of overlapping representations, which at this point is held in a 50-dimensional space. The information is then randomly projected to 2,000 ‘Kenyon cells’, which encode particular scents. (In mammals, cells in what is known as the piriform cortex handle this function.) This constitutes a 40-fold expansion in dimension, which makes it easier to distinguish odours by the patterns of neural responses.
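As an illustrative sketch only, the expansion step can be written in a few lines of Python; the connection statistics used here are an assumption standing in for the fly's actual wiring.

# Illustrative sketch: randomly project 50 projection-neuron activities
# into 2,000 'Kenyon cell' activities. Connectivity is assumed, not measured.
import numpy as np

rng = np.random.default_rng(0)
projection_neurons = rng.random(50)                        # activity of the 50 projection neurons
connections = (rng.random((2000, 50)) < 0.1).astype(float) # assumed: each Kenyon cell samples ~10% of inputs

kenyon_cells = connections @ projection_neurons            # 40-fold expansion: 50 -> 2,000 dimensions
print(kenyon_cells.shape)                                  # (2000,)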

For example, take 1,000 people in a cramped hall and try to organise them by hobby. In this crowded space, one might struggle to find a structured way to sort these people into their groups. But now spread them out on a football field: there is all this extra space in which to play around with and structure the data.

Once the fly’s olfactory circuit has achieved this, it needs to find a way to identify distinct odours with non-overlapping sets of neurons. It does so by ‘sparsifying’ the data. Only around 100 of the 2,000 Kenyon cells – 5 percent – are highly active in response to a given odour (less active cells are silenced), providing each odour with a unique tag.
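Continuing the illustrative sketch above, the winner-take-all step that keeps only the roughly 5 percent most active Kenyon cells yields a sparse tag for the odour; again, the numbers are assumed for the sake of the example.

# Illustrative sketch: silence all but the top ~5% most active Kenyon cells,
# leaving a sparse binary 'tag' for the odour. Inputs are assumed toy data.
import numpy as np

rng = np.random.default_rng(0)
connections = (rng.random((2000, 50)) < 0.1).astype(float)
kenyon_cells = connections @ rng.random(50)      # expansion step, as sketched above

k = 100                                          # roughly 5% of the 2,000 cells stay active
threshold = np.sort(kenyon_cells)[-k]
tag = (kenyon_cells >= threshold).astype(int)    # less active cells are silenced

print(int(tag.sum()), "of", tag.size, "cells active")   # the odour's sparse tag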

In short, while traditional deep networks (again taking their cues from the visual system) constantly change the strength of their connections as they learn, the olfactory system generally does not appear to train itself by adjusting the connections between its projection neurons and Kenyon cells.

One interesting avenue of research has uncovered parallels between the olfactory system and a class of models referred to as support vector machines. The aim is to develop a better understanding of how olfaction works, always keeping potential AI applications in mind.

In olfaction research during the early 2000s, algorithms were developed to determine how random embedding and sparsity in higher dimensions aid computational efficiency. This research led to connections being drawn with another type of machine learning model, the ‘support vector machine’. The ways in which the natural and artificial systems process information – using random organisation and dimensionality expansion to represent complex data efficiently – turned out to be formally equivalent. AI and evolution had thus converged, independently, on the same solution.
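A rough illustration of that equivalence, using invented data rather than the published models: a fixed random expansion into many dimensions allows a plain linear support vector machine to separate classes that are not linearly separable in the original space.

# Illustration only: random dimensionality expansion followed by a linear SVM.
# Data set, dimensions and the ReLU nonlinearity are assumptions for the example.
import numpy as np
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_circles(n_samples=600, noise=0.05, factor=0.5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2000))                   # fixed random projection, 2 -> 2,000 dims
b = rng.normal(size=2000)
expand = lambda Z: np.maximum(Z @ W + b, 0)      # simple nonlinearity after projection

raw = LinearSVC(max_iter=10000).fit(X_train, y_train)
lifted = LinearSVC(max_iter=10000).fit(expand(X_train), y_train)

print("linear SVM, raw 2-D inputs:          ", round(raw.score(X_test, y_test), 2))
print("linear SVM, random 2,000-D expansion:", round(lifted.score(X_test, y_test), 2))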

The interface between olfaction and machine learning continues to be explored in the search for a deeper link between the two. In 2009, an insect-based olfactory model initially created to recognise odours was shown to also recognise handwritten digits. Moreover, removing the majority of its neurons – to mimic how brain cells die and are not replaced – did not affect its performance to any significant degree. Parts of the system can go down, but the system as a whole keeps working. One application would be, say, a Mars rover, which has to operate under very harsh conditions.

More recently, the biological structure of olfaction has been revisited to gain insight into ways of improving performance on more specific machine learning problems.

Using the moth olfactory system as a foundation and comparing it with traditional machine learning models: given fewer than 20 samples, the moth-based model recognises handwritten digits better, but when provided with more training data, the other models prove much stronger and more accurate. Machine learning methods are good at producing very precise classifiers given massive data, whereas the insect model is very good at doing a rough classification very rapidly.

Olfaction appears to function more efficiently when it comes to speed of learning because, in this case, learning is no longer about seeking out features and representations that are optimal for the particular task at hand. Instead, it is reduced to recognising which of a slew of random features are useful and which are not, so the system can be trained very quickly.
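A minimal sketch of this idea, on assumed toy data: the features are fixed and random, and learning only adjusts a simple linear readout over them, so only a few quick passes through the data are required.

# Sketch: fixed random features, trainable linear readout. Data are invented.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 30))                   # 100 toy samples, 30 raw inputs
y = np.sign(X[:, 0] + 0.5 * X[:, 1])             # invented labels (+1 / -1)

R = rng.normal(size=(30, 600))                   # random features: never trained
F = np.maximum(X @ R, 0)

w = np.zeros(600)                                # only the linear readout learns
for _ in range(5):                               # a few quick passes
    for f, label in zip(F, y):
        if np.sign(f @ w) != label:              # perceptron-style update
            w += label * f

print("training accuracy:", (np.sign(F @ w) == y).mean())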

In effect, the olfaction strategy is akin to building certain basic, primitive concepts into the model, much as a general understanding of the world seems to be hard-wired into human brains. The structure itself is then capable of some simple, innate tasks without instruction.

A striking example of this came from a desire to find an olfaction-inspired way to perform searches on the basis of similarity. Just as YouTube can generate a sidebar list of videos for users based on what they are currently watching, organisms must be able to make quick, accurate comparisons when identifying odours. A fly might learn early on that it should approach the odour of a ripe banana and avoid the smell of vinegar, but its environment is complex and full of noise, and it will never experience exactly the same odour again. When it detects a new smell, then, the fly needs to figure out which previously experienced odours the scent most resembles, so that it can recall the appropriate behavioural response to apply.

An olfactory-based similarity search algorithm has been applied to data sets of images. The algorithm performed better than, and sometimes two to three times as well as, traditional non-biological methods involving dimensionality reduction alone. (In these more standard techniques, objects were compared by focusing on a few basic features, or dimensions.) The fly-based approach also used about an order of magnitude less computation to reach similar levels of accuracy, so it won either on cost or on performance.
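The flavour of such a search can be sketched, hypothetically, as follows: items are given sparse binary tags by a random expansion followed by a winner-take-all step, and similar items are retrieved by comparing tag overlap. The dimensions and data below are illustrative assumptions, not those of the published algorithm.

# Hypothetical sketch of fly-style similarity search over sparse tags.
import numpy as np

rng = np.random.default_rng(2)
R = (rng.random((50, 2000)) < 0.1).astype(float)   # sparse random projection

def tag(x, k=100):
    """Return the indices of the k most active 'Kenyon cells' for input x."""
    activity = R.T @ x
    return set(np.argsort(activity)[-k:])

database = rng.normal(size=(500, 50))              # 500 stored items (e.g. image features)
tags = [tag(item) for item in database]

query = database[42] + 0.1 * rng.normal(size=50)   # a noisy version of stored item 42
q = tag(query)
best = max(range(len(tags)), key=lambda i: len(q & tags[i]))
print("nearest stored item:", best)                # expected to recover item 42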


An essentially untrained network could already be useful for classification computations and similar tasks. Building in such an encoding scheme leaves the system poised to make subsequent learning easier. It could be used in tasks that involve navigation or memory, for instance — situations in which changing conditions (say, obstructed paths) might not leave the system with much time to learn or many examples to learn from.

Research is currently under way on, for example, an olfactory model that makes decisions about how to navigate a familiar route from a series of overlapping images.

There is, similarly, an olfaction-based method for novelty detection: the recognition of something as new even after exposure to thousands of similar objects in the past.
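One way such a detector could plausibly work is sketched below; the mechanism shown (a shared memory of tag counters) is an assumption for illustration, not the published method. Inputs whose sparse tags overlap little with anything previously stored are flagged as novel.

# Hypothetical sketch: novelty as the unseen fraction of an input's sparse tag.
import numpy as np

rng = np.random.default_rng(3)
R = (rng.random((50, 2000)) < 0.1).astype(float)

def tag(x, k=100):
    return np.argsort(R.T @ x)[-k:]

memory = np.zeros(2000)                          # how often each cell has been tagged

def novelty(x):
    """Fraction of the input's tag never seen before; then remember the input."""
    t = tag(x)
    score = np.mean(memory[t] == 0)
    memory[t] += 1
    return score

familiar = rng.normal(size=50)
for _ in range(20):                              # repeated exposures to similar inputs
    print(round(novelty(familiar + 0.05 * rng.normal(size=50)), 2), end=" ")
print("\nnovel input:", round(novelty(rng.normal(size=50)), 2))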

How the olfactory system processes mixtures also offers possibilities for application to other machine learning challenges. For instance, organisms perceive some odours as a single scent and others as a mix: a person might take in dozens of chemicals and know she has smelled a daffodil, or she might sense the same number of chemicals from a nearby bakery and differentiate between coffee and chocolate. Separable odours are not perceived at the same time; rather, the coffee and chocolate odours are processed very rapidly in alternation.

This insight is useful for AI, too. The cocktail party problem, for example, refers to how difficult it is to separate numerous conversations in a noisy setting. Given several speakers in a room, an AI might solve this problem by cutting the sound signals into very small time windows. If the system recognised sound coming from one speaker, it could try to suppress inputs from the others. By alternating like that, the network could disentangle the conversations.

This type of research can be taken a step further by creating what are referred to as ‘insect cyborgs’: the outputs of the moth-based model are used as inputs to a machine learning algorithm, leading to improvements in the system’s ability to classify images. The machine learning algorithm is given much stronger material to work with; some different kind of structure is pulled out by the moth brain, and having that different kind of structure helps the machine learning algorithm.

Current studies in olfaction hope to discover how multiple forms of learning can be coordinated in deeper networks, and so far have covered only a small part of the problem.

Odours can be mapped onto a hyperbolic space. Such an insight could indeed inform how best to structure the input data fed to deep learning systems.

A starting point could lie not only in implementing an olfaction-based architecture but also in figuring out how to define the system’s inputs – in seeking a way to describe smells. Images are more or less similar depending on the distances between their pixels in a type of visual space, but that kind of distance does not apply to olfaction. Nor can structural correlations provide a reliable bearing: odours with similar chemical structures can be perceived as very different, and odours with very different chemical structures can be perceived as similar.

One can instead define odours in terms of molecules – specifically, in terms of how often they are found together in nature – and then create a map by placing odour molecules closer together if they tend to co-occur, and farther apart if they do so more rarely. It has been discovered that, just as cities map onto a sphere (the Earth), odour molecules map onto a hyperbolic space, a sphere with negative curvature.
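As a hypothetical sketch of the co-occurrence idea, using invented data: from a toy matrix recording which molecules appear together in which natural samples, pairwise distances can be derived such that molecules that frequently co-occur end up close together.

# Sketch with invented data: co-occurrence-based distances between molecules.
import numpy as np

rng = np.random.default_rng(4)
presence = (rng.random((200, 30)) < 0.2).astype(float)   # 200 samples x 30 molecules (present/absent)

counts = presence.T @ presence                 # co-occurrence counts between molecules
freq = np.diag(counts)
# Jaccard-style similarity: shared occurrences / total occurrences
similarity = counts / (freq[:, None] + freq[None, :] - counts + 1e-9)
distance = 1.0 - similarity                    # frequent co-occurrence -> small distance

print("distance between molecule 0 and 1:", round(distance[0, 1], 3))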

Feeding inputs with hyperbolic structure into machine learning algorithms could help with the classification of less-structured objects. Deep learning starts from the assumption that inputs should be described with a Euclidean metric; one could try changing that metric to a hyperbolic one. Perhaps such a structure could further optimise deep learning systems.
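A minimal sketch of what swapping the metric means in practice: the distance between two points in the Poincaré ball model of hyperbolic space, which could, for instance, replace Euclidean distance in a nearest-neighbour classifier. The example points are arbitrary assumptions.

# Sketch: hyperbolic (Poincaré ball) distance as a drop-in replacement metric.
import numpy as np

def poincare_distance(u, v):
    """Hyperbolic distance between points u, v with ||u||, ||v|| < 1."""
    diff = np.linalg.norm(u - v) ** 2
    denom = (1 - np.linalg.norm(u) ** 2) * (1 - np.linalg.norm(v) ** 2)
    return np.arccosh(1 + 2 * diff / denom)

u = np.array([0.1, 0.2])                       # arbitrary points inside the unit ball
v = np.array([0.7, 0.5])
print("Euclidean distance: ", round(np.linalg.norm(u - v), 3))
print("hyperbolic distance:", round(poincare_distance(u, v), 3))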

Currently much of this remains theoretical. More difficult machine learning problems need to be devised to determine whether olfaction-inspired models stand to make a difference. This is all still emerging.

What gives researchers hope is the striking resemblance the olfactory system’s structure bears to other regions of the brain across many species, particularly the hippocampus, which is implicated in memory and navigation, and the cerebellum, which is responsible for motor control. Olfaction is an ancient system dating back to chemosensation in bacteria, and is used in some form by all organisms to explore their environments.

It appears to be closer to the evolutionary origin point of all the things we call cortex in general. Olfaction might provide a common denominator for learning. The system provides a really conserved architecture, one that is used for a variety of things across a variety of organisms. There must be something fundamental here that is useful for learning. 

The olfactory circuit could act as a gateway to understanding the more complicated learning algorithms and computations used by the hippocampus and cerebellum, and to figuring out how to apply such insights to AI. Researchers have begun turning to cognitive processes such as attention and various forms of memory in hopes that they might offer ways to improve current machine learning architectures and mechanisms. But olfaction might offer a simpler way to start forging those connections. It is an interesting nexus point, an entry point into thinking about next-generation neural nets.

.


(C-I)

.


.

.

Adversitate. Custodi. Per Verum

.