CTGI's original 1980s research in AI revolved around flat neural nets and high-speed learning algorithms that were computationally efficient, maximizing a constrained resource. Even then the focus was on accumulating market data by the terabyte for offline learning, using agents with names like Darwin and Lamarck, and then applying the derived knowledge base to problems in real time. That AI patent was filed in October 1998 and granted in 2001.
Today, rather than hundreds of costly dedicated servers (our AI routing and transaction-processing applications ran on 480 networked dedicated Xeons), we have on-demand parallel computation and hosted bare metal when needed - along with a set of increasingly powerful machine learning algorithms.
When Google did its initial research into AI brain simulation, it fielded 16,000 cores simulating more than one billion neural connections and processed ten million digital images - and the neural net taught itself to ... recognize cats. Self-learning AI - and, later, using AIs as sparring partners to up their games - is now going mainstream thanks to widespread compute and storage; AlphaGo and AlphaGo Zero are the obvious examples. AlphaGo learned from human game history; AlphaGo Zero learned from scratch.
So where can an AI research group differentiate itself? Data, Accuracy and Speed!
Few people realize that Tesla is an AI company building custom AI chips and accumulating billions of miles of driving data - and that the data represents an enormous competitive advantage. Combined with hand-tuned software and custom processing hardware, Tesla has moved the state of the art from a few frames per second to 2,100 frames per second. This isn't something you can buy off the shelf or assemble from open source - it is targeted development to solve a specific problem. Their dual chips process a total of 72 trillion operations per second at very low power consumption. Bandwidth is two terabytes per second (one for each of the dual chips). Their software tech compiles their neural network knowledge base to further streamline processing speed.
At Coastal we hand-craft AIs that self-optimize their performance over time: which sequences of algorithms and neural nets work best with which type of traffic? To do that we consume the traffic, run models, and then let the AI evaluate its own performance against those models. The approach applies across industry verticals - for example, every time the KNWN engine authenticates a person it learns more about that person, their environment, and their devices, and it self-improves its speed and accuracy. This isn't really programming anymore; it's curating data to feed to AIs.
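The consume-traffic, run-models, self-evaluate loop can be sketched as a simple self-optimizing router. This is a minimal illustration, not Coastal's actual engine: the class name, the epsilon-greedy selection rule, and the per-traffic-type accuracy tracking are all my assumptions about one way such a loop could work.

```python
import random
from collections import defaultdict

class SelfOptimizingRouter:
    """Illustrative sketch: track per-traffic-type accuracy of candidate
    pipelines and route new traffic to whichever has performed best so far
    (epsilon-greedy, so weaker pipelines still get occasional trials)."""

    def __init__(self, pipeline_names, epsilon=0.1):
        self.names = list(pipeline_names)
        self.epsilon = epsilon
        # traffic_type -> pipeline name -> [correct, total]
        self.stats = defaultdict(lambda: {n: [0, 0] for n in self.names})

    def choose(self, traffic_type):
        if random.random() < self.epsilon:          # keep exploring
            return random.choice(self.names)
        scores = self.stats[traffic_type]
        # Untried pipelines get a neutral prior of 0.5 so they still get used.
        return max(self.names,
                   key=lambda n: scores[n][0] / scores[n][1]
                   if scores[n][1] else 0.5)

    def record(self, traffic_type, name, correct):
        # The self-evaluation step: feed the AI's own scored outcome back in.
        c, t = self.stats[traffic_type][name]
        self.stats[traffic_type][name] = [c + int(correct), t + 1]
```

Over time the router's choices concentrate on whichever pipeline actually performs best for each traffic type - the "curating data" is the stream of recorded outcomes, not hand-written routing rules.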
And what do we do when there is no readily available data, or when we seek to raise the bar? Take a lesson from AlphaGo Zero and let the AIs create and extend virtualized learning sets. Or price and secure our services such that data sets are presented to us.
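The AlphaGo Zero lesson - generate your own labeled data from self-play when none exists - can be shown on a toy scale. The sketch below is purely illustrative (random tic-tac-toe play standing in for real self-play, with function names of my own invention): every position from every game gets labeled with the game's eventual outcome, producing a learning set from nothing but a simulator.

```python
import random

def random_self_play():
    """Play one random tic-tac-toe game; return (position_history, winner)."""
    board = [" "] * 9
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
             (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]
    history, player = [], "X"
    for _ in range(9):
        move = random.choice([i for i, c in enumerate(board) if c == " "])
        board[move] = player
        history.append(board[:])
        if any(board[a] == board[b] == board[c] == player for a, b, c in lines):
            return history, player
        player = "O" if player == "X" else "X"
    return history, None  # draw

def build_learning_set(n_games):
    """Label every position with the game's eventual outcome -- the raw
    material a value network could train on, generated from zero real data."""
    data = []
    for _ in range(n_games):
        history, winner = random_self_play()
        data.extend((position, winner) for position in history)
    return data
```

Real systems replace random play with the current model's own policy, so the generated set improves as the model does - that feedback loop is what "extending virtualized learning sets" buys you.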
So now one wonders: will a Tesla be able to spot a cat? And can AIs continue to learn from simulation versus real-world data?