MANAS—Make a Better Metaverse (2)

MANAS: The Key to Digital Synesthesia

AI is transitioning from perceptual intelligence to cognitive intelligence. The current industry consensus is that AI develops in three stages: computational intelligence, perceptual intelligence, and cognitive intelligence.

Computational intelligence is a machine's capacity to store information and compute. The weiqi match between AlphaGo and Lee Sedol is a perfect example of AI's computational intelligence; in this respect, AI has far exceeded human beings.

Perceptual intelligence is a machine's ability to listen, talk, observe, and recognize. Here, AI is almost on par with humanity: AI speech recognition can already imitate human voices convincingly and understand a dozen different languages, and face recognition can pick one person out of a crowd of hundreds or even thousands. In perceptual capacity, AI is at least equal to human beings, and in certain areas it clearly does a better job.

Cognitive intelligence is a machine's capacity to deduce and reason using its understanding of human language and common sense. It is an advanced level of AI, revolving around step-by-step deduction and accurate judgments grounded in common sense. Today, AI's cognitive intelligence still lags far behind what humans are capable of. In the future, with breakthroughs in brain-machine interfaces, AI may finally match humans in cognitive intelligence.

In the Metaverse, users will be able to perceive the digital world through all five senses. To make this easier to understand, we call it synesthesia. In its original sense, synesthesia breaks the boundaries between sight, smell, touch, hearing, and taste, making these senses interchangeable so that perception becomes richer and fuller. In the Metaverse, digital synesthesia is achieved by AI and machines that enable our bodies to experience the virtual world more vividly than is possible in physical reality, and to express and receive the warmth of emotions. Language sits at the bottom layer of digital synesthesia, and hearing is the basis of verbal interaction, while digital sight (VR) and digital touch (haptic equipment) let our bodies feel the texture of the virtual world. The integration and synergy of all these virtual senses will produce an experience surpassing what reality can offer.

The development speed of AI, especially of cognitive intelligence, will therefore directly affect how fast the Metaverse matures. As a nascent concept, the Metaverse doesn't yet have a clear definition, but it does have some distinguishing features: concrete digital identity, multi-modal perceptual immersion, low latency, a multi-dimensional virtual world, accessibility from anywhere, and a complete economic and social system. Because of these features, the Metaverse will create vast quantities of data and consume vast amounts of computing power. High-bandwidth networks, Metaverse devices of every shape and form, and natural, intuitive human-machine interaction will all demand smarter algorithms to keep up with the growth of the Metaverse and its user base. The success of the Metaverse will depend in particular on human-machine interaction, which shapes user experience, and on large-scale data processing, which affects the Metaverse's governance model. Breakthroughs in both areas depend on AI. How fast AI develops will therefore determine the progress of digital synesthesia, which in turn will decide how soon the Metaverse takes off.

MANAS will provide users with the solutions needed for a better digital synesthesia.
