Recently, Dogecoin (DOGE/USD) co-founder Billy Markus tweeted, "Would you be friends with yourself if you could upload your brain to the cloud and talk to a virtual version of yourself?" Elon Musk replied, "I've already done it."
Virtual humans have swept the world, drawing more traffic than top-tier celebrities. Yet even the most popular virtual humans still have no thoughts of their own; all they offer are a few body movements and basic recognition abilities. Even "Tie Da," the humanoid robot launched by Xiaomi, is essentially just a shell of metal: although it can perceive people's emotions through recognition, it cannot think. So how far is this from the "intelligent" robot we imagine?
What makes humans human is thought, and the brain-computer interface may become the intelligent "brain library" of the next generation of virtual humans and intelligent robots. Instead of manual or voice input, we could communicate with a robot directly through an EEG cap, giving it goals for thought and behavior and directing its body just as we use our own. But what exactly is a brain-computer interface?
A brain-computer interface (BCI), sometimes called a "brain port," a direct neural interface, or "brain-machine fusion perception," uses external devices such as computers, electrodes, and chips to replace conventional intermediaries such as nerves and muscles, connecting the brain directly to the outside world. As a new communication and control technology for information exchange, it subverts traditional human-computer interaction. BCI technology integrates knowledge from brain science, neuroscience, information science, materials science, biology, systems science, and medical engineering, coupling the biological brain with artificial intelligent systems to perceive and translate consciousness and achieve zero-distance information exchange between machines and humans. With a one-way brain-computer interface, the computer either accepts commands from the brain or sends signals to the brain (for example, video reconstruction), but cannot send and receive at the same time. A two-way brain-computer interface allows information to flow in both directions between the brain and external devices.
In recent years Neuralink, the brain-computer interface company backed by Elon Musk, has attracted wide attention. In 2019 Neuralink unveiled its microchips and surgical robot; in 2020 it conducted its first animal experiments, implanting chips and electrodes in the heads of three pigs and successfully reading their EEG signals; in 2021 it announced its hope of using brain-computer interface technology to help paralyzed patients stand up again. Unlike Blake Lemoine, the Google AI ethics researcher who claimed that LaMDA is conscious like a human, a brain-computer interface can genuinely read human neural activity: a neural network can learn to recognize human thought, emotion, and behavior from the EEG signals collected by an EEG cap, bringing machines one step closer to being "smart persons."
Wide range of applications
1. Medical Field
Compared with other physiological signals, EEG signals provide deeper and more authentic emotional information. Deep learning algorithms can extract features from EEG signals and use them to recognize and analyze a range of emotions (such as sadness, anger, fear, surprise, joy, and calm), which can assist in the treatment of mental illnesses such as depression and anxiety and in the study of their pathogenesis.
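The feature-extraction step described above can be sketched in a few lines. The sampling rate, band edges, and the nearest-centroid classifier below are illustrative assumptions, not the method of any specific study; a real system would use learned features and a trained deep model.

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Return mean spectral power of one EEG channel in each band (via FFT)."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

def nearest_centroid(features, centroids, labels):
    """Assign the emotion label of the closest class centroid."""
    dists = [np.linalg.norm(features - c) for c in centroids]
    return labels[int(np.argmin(dists))]

# Toy usage: a synthetic 1-second trial dominated by a 10 Hz alpha rhythm
t = np.arange(FS) / FS
trial = np.sin(2 * np.pi * 10 * t)
feats = band_powers(trial)
print(feats.argmax())  # -> 1, i.e. the alpha band dominates
```

In practice the band-power vector would be computed per channel and fed to a trained classifier; the centroid rule here just stands in for that final decision step.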
In terms of mental illness rehabilitation treatment, neurofeedback training based on brain-computer interface can play an active role in the treatment of depression and anxiety disorders. For example, Musk’s Neuralink company is researching brain-computer interface technology to solve mental diseases such as schizophrenia and memory loss.
Domestically, in December 2020, the Brain-Computer Interface and Neuromodulation Center of Ruijin Hospital, affiliated with Shanghai Jiao Tong University School of Medicine, was officially established, and a clinical research project was launched at the same time to treat treatment-resistant depression through a multimodal affective brain-computer interface and deep brain stimulation. The figure below shows the full EEG emotion-recognition pipeline: on the left, the 64-lead EEG cap; in the middle, the graph convolutional neural network; on the right, the recognized emotion label.
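The middle stage of that pipeline, a graph convolution over EEG channels, can be sketched with the common propagation rule H' = D^(-1/2)(A + I)D^(-1/2) H W. The 4-electrode ring below is an assumed toy montage for illustration, not the 64-lead layout in the figure.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer with symmetric normalization and ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops to the adjacency
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# Toy montage: 4 electrodes in a ring, 3 features per channel
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
H = np.random.default_rng(0).normal(size=(4, 3))  # per-channel features
W = np.random.default_rng(1).normal(size=(3, 2))  # learnable weights
print(gcn_layer(A, H, W).shape)  # -> (4, 2)
```

The adjacency matrix encodes which electrodes are treated as neighbors, which is exactly why graph convolutions suit EEG caps: spatial relationships between leads are modeled explicitly rather than flattened away.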
In addition, brain-computer interfaces can be applied to the treatment of limb movement disorders, epilepsy, and other conditions. Their improving effect can be used in the treatment of diseases such as traumatic brain injury, stroke, and epilepsy. Their restorative effect can support rehabilitation training for upper- and lower-limb motor dysfunction after stroke, promoting neuroplasticity in the motor cortex through active training, and can also aid recovery from conditions such as muscle weakness and spinal cord injury. Their substitutive effect can give patients with amyotrophic lateral sclerosis, locked-in syndrome, myasthenia gravis, or aphasia supplementary tools for daily life, such as brain-controlled prosthetics, brain-controlled wheelchairs, and brain-computer communication systems.
2. Military Field
According to research by the US Department of Defense, brain-computer interfaces have potential applications in the military field. With their help, the military could conduct covert communications, effectively resist radio espionage and jamming, and reach consensus on strategic deployment and operational policy. At the same time, a brain-computer interface could let operators control fighter jets or tanks remotely by thought, reducing the casualties caused by frontal assaults. From this perspective, brain-computer interface technology may reshape the pattern of future warfare.
3. Education Field
At present, some researchers study students' neurophysiological activity in learning situations and, combining methods such as machine learning and deep learning, try to quantitatively describe learning states such as attention level and mental workload, so as to provide each student with a customized, personalized learning plan and truly "teach students in accordance with their aptitude." Attention detection and early-warning systems for online learning based on brain-computer interfaces have also been built, and their application has been found to significantly improve students' academic performance.
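An attention early-warning rule of the kind described above can be sketched with the theta/beta power ratio, a commonly used engagement proxy. The sampling rate, band edges, and warning threshold below are assumed values, not those of any validated classroom system.

```python
import numpy as np

FS = 128  # assumed sampling rate in Hz

def power_in_band(signal, lo, hi, fs=FS):
    """Mean spectral power of a signal between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def attention_alert(signal, threshold=2.0):
    """Warn when theta (4-8 Hz) activity dominates beta (13-30 Hz) activity."""
    ratio = power_in_band(signal, 4, 8) / power_in_band(signal, 13, 30)
    return "warn" if ratio > threshold else "ok"

# Synthetic 1-second windows: theta-heavy (drowsy) vs beta-heavy (focused)
t = np.arange(FS) / FS
drowsy = np.sin(2 * np.pi * 6 * t)    # strong 6 Hz theta rhythm
focused = np.sin(2 * np.pi * 20 * t)  # strong 20 Hz beta rhythm
print(attention_alert(drowsy), attention_alert(focused))  # -> warn ok
```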
4. Entertainment Field
In entertainment and games, brain-computer interface technology can be combined with virtual reality (VR): sensors worn on the player's scalp collect brain signals and transmit them to a computer, where a decoding algorithm converts them into the commands the game needs to execute. Players can thus control games with their "thoughts," achieving a truly immersive gaming experience. At the same time, brain-computer interfaces are very friendly to players with disabilities and can expand the game's customer base.
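The decode-and-act loop just described can be sketched as follows. The intent labels, game commands, and the threshold "decoder" are all invented for illustration; a real system would substitute a trained classifier for `decode_intent`.

```python
# Map each decoded intent label to a hypothetical game command.
COMMANDS = {"left": "MOVE_LEFT", "right": "MOVE_RIGHT", "rest": "IDLE"}

def decode_intent(window):
    """Stand-in decoder: the sign of the mean amplitude picks the intent."""
    mean = sum(window) / len(window)
    if mean > 0.1:
        return "right"
    if mean < -0.1:
        return "left"
    return "rest"

def bci_game_step(window):
    """One loop iteration: decode an EEG window, emit the matching command."""
    return COMMANDS[decode_intent(window)]

# Simulated stream of 1-second signal windows
stream = [[0.5, 0.4, 0.6], [-0.3, -0.5, -0.4], [0.0, 0.05, -0.02]]
print([bci_game_step(w) for w in stream])
# -> ['MOVE_RIGHT', 'MOVE_LEFT', 'IDLE']
```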
The application of brain-computer interfaces across these fields lays the cornerstones for truly intelligent robots that can think. When those cornerstones are stacked high enough, perhaps a conscious intelligent robot will be born.
About Magic Data
Magic Data is a global AI data solutions provider with 3 core products: MD training datasets, Annotator platform, and data collection and annotation services.
Founded in 2016, Magic Data is dedicated to providing high-quality text, audio, image, and multi-modal training datasets to the AI industry. The company has accumulated over 400 licensable, ready-to-use datasets for machine learning, including over 200,000 hours of speech data for ASR models and over 3,000,000 sentences of text data for NLP.
Magic Data is ISO/IEC 27001 & ISO/IEC 27701:2019 accredited and GDPR compliant.
For more information, visit www.magicdatatech.com.