The VR simulation we use to study a subject’s progress is powered by our unique Joint Working System (JWS). The JWS brings multiple systems together to give us a full picture of the subject’s subconscious behavioural tendencies. Here we’re going to look at this system and how it has been developed.
The Origins Of The DNN – DGN Interface
Researchers have attempted to decode brain activity as a means of recreating images, words, and even video for many years, first finding success in a limited capacity in 2011. Many of these early methods allowed computers to ‘categorise’ brain activity only in a very broad sense: recognising that the subject was thinking about a bird, a face, or a word, but being unable to elaborate beyond this.
The first major breakthrough came in 2018, when researchers at Kyoto University, Japan, developed a system that allowed computers to reconstruct images in real time. This method used functional magnetic resonance imaging (fMRI) to gather brain activity data, which was then fed into a complex multi-layered AI designed to mimic the hierarchical way our brains process visual information.
This AI, known as a deep neural network (DNN), would then produce output data that could be matched against raw data taken from initial fMRI scans of the test subjects. The team then fed these results through a further AI, a deep generator network (DGN), designed specifically to pinpoint readings relating to dominant features such as eyes or textures. As a result, the team was able to create images far closer to those in the test subject’s mind.
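The two-stage decode-and-generate approach described above can be sketched in miniature. Everything here is an illustrative stand-in: the function names, the toy linear models, and the dimensions are assumptions made for demonstration, not the actual decoder or generator used by the researchers.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_features(fmri_scan, weights):
    """DNN stage: map a flattened fMRI scan to hierarchical feature
    estimates. A toy linear model stands in for the trained decoder."""
    return np.tanh(weights @ fmri_scan)

def refine_image(features, generator):
    """DGN stage: turn decoded features into pixel values, emphasising
    dominant features. Again, a toy linear 'generator'."""
    return np.clip(generator @ features, 0.0, 1.0)

# Toy dimensions: 500 voxels -> 64 features -> a 16x16 image.
scan = rng.normal(size=500)
decoder = rng.normal(size=(64, 500)) * 0.05
generator = rng.normal(size=(256, 64)) * 0.1

features = decode_features(scan, decoder)
image = refine_image(features, generator).reshape(16, 16)
print(image.shape)  # (16, 16)
```

The point of the sketch is the division of labour: the first stage estimates features from raw scan data, and only the second stage commits those features to pixels.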
Progress Made With The Ailuros Project
It should be noted that the images were not 100% accurate, with shapes often blurred or slightly shifted in position. However, this research provided an important foundation for the system we use in the Ailuros System. In particular, the combined approach of using a DNN and a DGN to imitate the way the human brain responds to visual stimuli is one that has been refined rather than replaced. This has allowed for far greater accuracy, not only in image recreation, but also in reading sentences that test subjects are thinking about and in registering movement in dreams.
The biggest change we have made is the switch from fMRI to functional near-infrared spectroscopy (fNIRS). fMRI requires large machines to take readings, so space and portability were major hurdles for any large-scale roll-out. Initially, the switch came with its own problems, mostly centred on fNIRS’s lower spatial resolution and shallower scan penetration depth compared to fMRI.
Our team created our first modified helmet in early 2020. This improved not only the spatial resolution but also the penetration depth. It was achieved through a mix of modified equipment, which increased the strength of the near-infrared light to bring it closer to the pure infrared range, and custom software to improve imaging quality.
Prolonged exposure to the modified fNIRS still carried an inherent risk of tissue damage; however, this is something we were able to combat easily. The Neg-Vac samples we use have few side effects in all but the most severe cases. They do, however, make subjects more susceptible to certain stimuli, at least for the first day after use. This means that, by keeping a robust record of each person’s original scan data, we can essentially play the Alleviation Sim at high speed, thus keeping fNIRS helmet use to a minimum.
Data And The Rule Of Seven
In the Ailuros Project, we use what we refer to as the ‘Rule of Seven’: each reading is run through the DNN seven times, with the DNN algorithm altered slightly on each pass. This allows us to account for potential mis-readings in the initial fNIRS scan, slight fluctuations in the subject’s brain activity, and AI error. It also gives the DGN a far greater amount of data to draw on when bringing the final pictures together.
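The Rule of Seven amounts to a small ensemble: seven decoding passes, each with a slightly perturbed decoder, whose outputs are all handed on to the DGN. The sketch below assumes the "slight alteration" is random weight jitter; that perturbation scheme, along with the toy decoder itself, is an illustrative guess.

```python
import numpy as np

rng = np.random.default_rng(42)

def dnn_pass(reading, weights):
    """One decoding pass: a toy linear stand-in for the DNN."""
    return np.tanh(weights @ reading)

def rule_of_seven(reading, base_weights, jitter=0.02):
    """Run the reading through seven slightly perturbed copies of the
    decoder, collecting every result for the downstream DGN stage.
    The jitter-based perturbation is an assumption for illustration."""
    results = []
    for _ in range(7):
        perturbed = base_weights + rng.normal(scale=jitter,
                                              size=base_weights.shape)
        results.append(dnn_pass(reading, perturbed))
    return np.stack(results)  # shape: (7, n_features)

reading = rng.normal(size=100)
weights = rng.normal(size=(32, 100)) * 0.1
vpd_candidates = rule_of_seven(reading, weights)
print(vpd_candidates.shape)  # (7, 32)
```

Because each pass sees a marginally different decoder, the spread across the seven outputs also gives a rough measure of how stable any given reading is.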
For images, these readings form what we refer to as Visual Population Data (VPD). The VPD is used to populate the VR sim with characters inhabiting the subject’s subconscious mind. It also allows us to study the way subjects picture events and physical responses, providing important data on how they see the world and their potential actions.
The fNIRS readings are also cross-referenced with data gathered on muscle activity. This forms what we refer to as Movement Population Data (MPD). While the MPD is used to guide the test subject through the VR sim, governing how they traverse the environment, it also has another application. In cases where the subject assumes multiple roles in the sim, noting which character or action receives stronger readings can often determine which behaviours the subject more strongly relates to.
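That second application reduces to a simple comparison: whichever in-sim role shows the strongest movement-linked signal is the one the subject most strongly relates to. The role names and signal values below are invented purely for illustration.

```python
def dominant_role(mpd_strengths):
    """Given mean movement-signal strength per in-sim role, return the
    role with the strongest reading. A deliberate simplification of
    the comparison described above."""
    return max(mpd_strengths, key=mpd_strengths.get)

# Hypothetical per-role signal strengths from one session.
strengths = {"character_a": 0.31, "character_b": 0.72, "character_c": 0.18}
print(dominant_role(strengths))  # character_b
```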
The combined EMG-fNIRS used during a subject’s Calibration Day also lets us gather data relating to language. By combining our initial scan data with readouts acquired in real time, we are able to create Speech Population Data (SPD). The SPD is used to provide virtual verbal responses to stimuli, both self-generated by the subject and in response to auditory nudges provided by the project. This is further refined by an AI trained to recognise speech-related fNIRS readouts across a number of different people. This AI can then ‘advise’ the sim in real time on the commonality between the live readings, the subject’s sound database, and multiple historical readings.
As you can see, the JWS contains multiple moving parts. Primarily, though, it can be viewed as a collaboration between real-time scans and a number of AIs. As such, it could be argued that the AI parts of the Ailuros System are as much members of the team as the humans monitoring the subjects. This is more than a collaboration between humans and technology, however. It is also a collaboration between the team and the members of the public who wish to work towards a crime-free future. That being the case, all those who take part in the initiative are also important parts of the Joint Working System that is the Ailuros Project. Together, we can make the world a safer place.