How a Brain-inspired Theory May Solve Problems of Big Data

We think of supercomputers as being super-powerful. But they can be overwhelmed by the floods of information now produced in science, commerce, government, meteorology, social media, and other areas.

And compared with the human brain, they gobble up a lot of energy and take up a lot of space. But the human brain may provide some answers, in the shape of the SP machine, based on the SP theory of intelligence, which is itself based on research into the workings of brains and nervous systems. This new thinking about "Big data and the SP theory of intelligence", developed by Dr Gerry Wolff of CognitionResearch.org, has now been published in the journal IEEE Access [Note 1]. "We can save a lot of energy by using probabilities," said Dr Wolff.

"Instead of doing computations mechanically, we can concentrate our efforts where answers are most likely to be found. We don't need a special process for gathering information about probabilities because, as a by-product of how the SP system works, it creates a statistical model of its environment."

Big savings may also be possible in transmitting things like TV programmes. With some further development, the SP system may learn general rules and patterns from that kind of information.
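The idea of concentrating effort where answers are most likely to be found can be illustrated, in a very reduced form, by examining candidates in descending order of estimated probability rather than scanning them blindly. The candidate scores and the matching test below are hypothetical illustrations, not part of the SP model itself.

```python
# Hypothetical sketch: searching candidates in descending order of
# estimated probability and stopping at the first acceptable match can
# examine far fewer items than an exhaustive scan.
def probability_guided_search(candidates, is_match):
    """candidates: list of (item, estimated_probability) pairs.

    Returns (matching_item_or_None, number_of_candidates_examined)."""
    examined = 0
    # Look at the most probable candidates first.
    for item, _prob in sorted(candidates, key=lambda c: -c[1]):
        examined += 1
        if is_match(item):
            return item, examined
    return None, examined

# Toy data: the true answer has been assigned a high estimated probability,
# so it is examined first.
candidates = [("a", 0.01), ("b", 0.02), ("answer", 0.9), ("c", 0.05)]
item, examined = probability_guided_search(candidates, lambda x: x == "answer")
print(item, examined)  # -> answer 1
```

A blind scan of the same list would have examined three candidates before reaching the answer; the probability ordering finds it on the first try.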

If the transmitter of a TV programme, and TV sets, all know those rules and patterns, then a TV programme can be transmitted economically by sending only the parts that are different from the general rules and patterns.

The SP system may help to bring some order into the chaos of different ways in which knowledge is represented in computer systems. In just one area, the representation of images, there are many different formats: JPEG, TIFF, WMF, BMP, GIF, EPS, PDF, PNG, PBM, and more, and each one has its own special mode of processing. "This jumble of different formalisms and formats for knowledge is a great complication in the processing of big data, especially in processes for the discovery or learning of structures and associations in big data," said Dr Wolff. The SP system may help to simplify things by serving as a universal framework for the representation and processing of knowledge.

The SP system may also help in such things as recognizing patterns in big data, reasoning about big data, and presenting structures and processes in visual forms that would help people understand big data.

"A useful step forward in developing these ideas would be the creation of a high-parallel version of the SP machine," said Dr Wolff. "This would be based directly on the existing SP computer model, it would be hosted on an existing high-performance computer, and it would provide a means for researchers everywhere to see what can be done with the system and to create new versions of it."
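The idea of transmitting only the parts that differ from rules and patterns already known at both ends can be sketched as a simple delta encoding against a shared reference. The byte-level scheme below is a generic illustration of that principle, not the SP system's own method.

```python
# Generic delta-encoding sketch: if sender and receiver both hold a shared
# reference (the "general rules and patterns"), only the positions that
# differ from it need to be transmitted.
def encode_delta(shared, message):
    """Return (length, diffs, tail): the message length, the list of
    (position, character) differences, and any tail beyond the reference."""
    diffs = [(i, c) for i, (r, c) in enumerate(zip(shared, message)) if r != c]
    return len(message), diffs, message[len(shared):]

def decode_delta(shared, length, diffs, tail):
    """Reconstruct the message from the shared reference plus the delta."""
    out = list(shared[:length])
    for i, c in diffs:
        out[i] = c
    return "".join(out) + tail

shared = "the quick brown fox jumps over the lazy dog"
message = "the quick brown cat jumps over the lazy dog"
length, diffs, tail = encode_delta(shared, message)
print(diffs)  # only three changed characters are sent, not the whole string
print(decode_delta(shared, length, diffs, tail) == message)  # -> True
```

Here only three (position, character) pairs cross the wire instead of the full 43-character message; the larger the shared reference relative to the differences, the greater the saving.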