The company partnering with Google to achieve nuclear fusion talks AI with Access-AI
Posted: 26 July 2017 | By Charlie Moloney
Tri Alpha Energy (TAE) is a company leading the way in nuclear fusion research — the process of fusing atomic nuclei at extreme temperatures, widely believed to hold the secret to an endlessly renewable source of clean energy — and it is using artificial intelligence (AI) to ramp up its efforts.
Support has rolled in: $500m in investment, backing from Microsoft co-founder Paul Allen, and now a new partnership with Google Research to create an advanced algorithm for solving complex problems, as reported by the Guardian on Tuesday.
But how is AI featuring in this potentially world-changing research? We know that you want the specific details of which AI technologies are being used and how they are pushing results through the roof, so we found out for you.
Michl Binderbauer, President and CTO of TAE, responded to a press request from Access-AI, and here’s what he had to say:
We use various AI techniques across our work. They are mostly experimental data driven. We use mainly machine learning and neural nets.
The substance of the work reported with Google was primarily directed towards improving optimization of particular machine conditions, something that has greatly sped up our ability to do sequences of experimental studies in less time – what used to take a month can now be done in about an afternoon.
It is hard to characterize or predict the exact nature of the boost coming from the use of AI. It is certainly dramatic and as we understand more about its benefits and ability to enhance our work, it will further speed up learning, discovery and delivery of goals.
What used to take a month can now be done in about an afternoon
Perhaps more importantly, efficient and ultimately successful control and feedback may not be possible without the use of AI – so AI is akin to a necessary condition.
More data also helps to train algorithms and systems. To that end we have an extremely large dataset already — over 50,000 experiments on our 3rd and 4th generation machines. Large, consistent, and complete datasets are essential.
However, given the evolving nature of our machines and their ability, there are gaps and changes in the structure and completeness of the data.
Therefore, one challenge and key focus of our AI efforts is to make sure that we develop algorithms and techniques that are adaptable and capable of operating on noisy and variable data.
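The kind of optimization Binderbauer describes can be pictured as a simple human-in-the-loop search: the computer proposes a perturbed variant of the current machine settings, and a judge — a human expert in the real experiments — keeps whichever of the two performs better. The sketch below is purely illustrative: the setting names, the toy performance function, and the automated judge are invented stand-ins, not TAE's actual parameters or method.

```python
import random

def optometrist_step(settings, judge, scale=0.05):
    """Propose a small random perturbation of the current settings and
    keep whichever of the two the judge prefers. In a real experiment a
    human expert plays the judge; here it is any callable."""
    candidate = {k: v * (1 + random.uniform(-scale, scale))
                 for k, v in settings.items()}
    return candidate if judge(candidate, settings) else settings

# Toy stand-in for "plasma performance": best at field=1.0, beam=2.0.
def performance(s):
    return -((s["field"] - 1.0) ** 2 + (s["beam"] - 2.0) ** 2)

# Automated judge used only for this sketch (a human would do this).
def automated_judge(candidate, current):
    return performance(candidate) > performance(current)

random.seed(0)
settings = {"field": 0.5, "beam": 1.0}
for _ in range(500):
    settings = optometrist_step(settings, automated_judge)
```

Because each step only ever accepts an improvement, the measured performance never gets worse — which is what makes a rapid sequence of experimental runs, rather than one month-long campaign, practical.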