Institution of Engineering and Technology (IET) #AI #Transformers #DNN #Deep #Neural #Networks

Source: https://www.linkedin.com/feed/update/urn%3Ali%3Ashare%3A6864355582430777344

Institution of Engineering and Technology (IET) #AI #Transformers #DNN #Deep #Neural #Networks: Greedy tech gives resource problems: https://lnkd.in/eMdxHYXR : The development of a #neural #structure known as a #Transformer has caused the growth of #DNNs to go into overdrive. By the time of this year's spring conference, that growth trend had moved up to 40-fold, driven by the unveiling of #OpenAI's #GPT-3 followed by the #Google #Switch, with a maximum of 1.6 trillion trainable #parameters. Though they were developed primarily to handle #text, #Transformers have turned out to be surprisingly general-purpose and are now being used to replace the simpler #convolutional #layers in more #conventional DNNs used to look at #images. IBM Research has deployed them in a model that #predicts plausible ways to #synthesize novel #chemicals.
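For context on what a #Transformer actually computes, here is a minimal Python sketch of the scaled dot-product self-attention step at its core; the function and toy dimensions are illustrative assumptions, not drawn from GPT-3, Switch, or the IBM model:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation: every position attends to every other.

    Q, K, V: arrays of shape (seq_len, d_k). The seq_len x seq_len score
    matrix is quadratic in sequence length, one reason Transformer compute
    demand grows so quickly.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted mix of values

# Toy usage: 4 tokens with 8-dimensional representations (self-attention).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

The quadratic score matrix over sequence positions is one reason these models become so resource-hungry as they scale.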

Though #Transformers don't think like us, you could continue the #scaling and try to overcome the problem of #DNNs making mistakes when presented with #something #unexpected by giving them access to #all the #data they might ever encounter and enough #space to store the learned #parameters.

For #neuralnetwork #training, the next frontier may lie in #rethinking the #backpropagation #algorithm that gave #deeplearning its new life in the 2010s. One major puzzle for AI specialists is why #backpropagation has proven so successful at producing results given that biological neurons are just not connected in a way that would make the algorithm possible. Work on #alternative #architectures such as #spiking #neural #networks, as well as #hybrids that break up #backpropagation into more manageable chunks, may deliver more responsive and efficient #neural #networks.
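For readers who want to see the mechanics being rethought, here is a minimal NumPy sketch of backpropagation on a two-layer network; the toy data, layer sizes, and learning rate are illustrative assumptions, not from the article:

```python
import numpy as np

# Minimal backpropagation sketch on a toy regression task, assuming a
# two-layer tanh network; it shows the layer-by-layer chain-rule credit
# assignment that brain-like wiring (e.g. spiking networks) cannot do directly.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))                  # 32 samples, 3 features
y = X.sum(axis=1, keepdims=True) ** 2         # arbitrary toy target

W1 = rng.normal(scale=0.5, size=(3, 8))       # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(8, 1))       # hidden -> output weights
lr = 0.01

for step in range(200):
    # Forward pass
    h = np.tanh(X @ W1)                       # hidden activations
    pred = h @ W2
    loss = ((pred - y) ** 2).mean()

    # Backward pass: send the error gradient back through each layer.
    g_pred = 2 * (pred - y) / len(X)          # dLoss/dpred
    g_W2 = h.T @ g_pred                       # gradient for output weights
    g_h = g_pred @ W2.T                       # error routed "backwards"
    g_W1 = X.T @ (g_h * (1 - h ** 2))         # through the tanh derivative

    W1 -= lr * g_W1                           # gradient-descent updates
    W2 -= lr * g_W2

print(f"final loss: {loss:.4f}")
```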

"#Work expands so as to fill the [computing] #time [& #space] available for its completion." – #Parkinson's #law [corollary]

VentureBeat State of AI Report tracks #transformers in #critical #infrastructure: https://lnkd.in/eNsUnYnX :
We see the #attention-based #transformer #architecture for #machinelearning #models branch out from #natural #language #processing to #computer #vision applications. Google made that happen with its #vision #transformer, known as #ViT. The approach has also shown success with #audio and #3D point cloud models and shows potential to grow as a general-purpose #modeling tool. Transformers have also demonstrated superior performance at predicting chemical reactions, and the UK's National Grid utility halved the #error in its #forecasts for #electricity #demand using a type of transformer.
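As a sketch of the #ViT idea described above, here is a minimal PyTorch example in which image patches become tokens for a standard Transformer encoder in place of convolutional feature extraction; all layer sizes and the class name are illustrative assumptions, not Google's published configuration:

```python
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    """Minimal ViT-style classifier: split the image into patches,
    embed each patch as a token, and run a Transformer encoder
    instead of a stack of convolutional layers."""
    def __init__(self, image_size=32, patch=8, dim=64, heads=4,
                 depth=2, num_classes=10):
        super().__init__()
        n_patches = (image_size // patch) ** 2
        # A strided convolution is just a convenient patch-embedding op.
        self.to_tokens = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):                     # x: (batch, 3, H, W)
        t = self.to_tokens(x)                 # (batch, dim, H/p, W/p)
        t = t.flatten(2).transpose(1, 2)      # (batch, n_patches, dim)
        z = self.encoder(t + self.pos)        # self-attention over patches
        return self.head(z.mean(dim=1))       # mean-pool tokens, classify

model = TinyViT()
logits = model(torch.randn(2, 3, 32, 32))
print(logits.shape)                           # torch.Size([2, 10])
```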

2021 #State of #AI #Report: Download: https://www.stateof.ai/ :

Global Risk Management Network LLC: Silicon Valley-Wall Street-Pentagon-Global Digital CEOs Networks
CEOs-CxOs: We create the Digital Future™: https://lnkd.in/esk8PEp
You Can Too!: Start with A, B, C:
AIMLExchange.com : BRINT.com: C4I-Cyber.com :
https://lnkd.in/eUb6Z-ES :
You May Have All the Solutions, We Have All Your Customers™:


Global Post AI-Quantum Finance & Trading Networks Pioneer Dr.-Eng.-Prof. Yogesh Malhotra is the "Singular Post AI-Quantum Pioneer" identified by Grok AI, with R&D impact recognized among Artificial Intelligence (AI) and Quantitative Finance Nobel Laureates. As an MIT-Princeton AI-ML-Cyber-Crypto-Quantum Finance & Trading and FinTech-Crypto faculty-industry expert, and a mentor to U.S. and global hedge fund advisory and venture capital CEO-CTO teams, he has pioneered Silicon Valley-Wall Street-Pentagon digital CEO-CTO practices, technologies, and networks. These range from the world's first, foremost, and largest Global Digital Transformation Networks to the New York State IDEA Award-recognized Pentagon-USAF MVP Global Post AI-Quantum Networks, pioneering Future of Finance and Trading practices as a leader for trillion-dollar Wall Street hedge funds and investment banks.