Source: https://www.linkedin.com/feed/update/urn%3Ali%3Ashare%3A6864355582430777344
Institution of Engineering and Technology (IET) #AI #Transformers #DNN #Deep #Neural #Networks: Greedy tech gives resource problems: https://lnkd.in/eMdxHYXR : The development of a #neural #structure known as the #Transformer has sent #DNNs' growth into overdrive. By the time of this year's spring conference, the trend line for model size had moved up 40-fold, driven by the unveiling of #OpenAI's #GPT-3 followed by #Google's #Switch, with a maximum of 1.6 trillion trainable #parameters. Though they were developed primarily to handle #text, #Transformers have turned out to be surprisingly general-purpose and are now being used to replace the simpler #convolutional #layers in more #conventional DNNs used to look at #images. IBM Research has deployed them in a model that #predicts plausible ways to #synthesize novel #chemicals.
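For readers new to the architecture, here is a minimal sketch of the scaled dot-product self-attention at the heart of a Transformer. The matrix sizes and the helper names (softmax, self_attention) are illustrative assumptions, not from the article; production models add multiple attention heads, residual connections, and layer normalization.

```python
# Minimal sketch of scaled dot-product self-attention, the core Transformer
# operation. Illustrative only: shapes and weights are toy assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # subtract max for numerical stability
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) learned projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise token affinities
    return softmax(scores) @ V               # each token becomes a weighted mix of values

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))                 # 5 tokens, 16-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(16, 8)) * 0.1 for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)   # (5, 8)
```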
Though #Transformers don't think like us, one could continue the #scaling and try to overcome the problem of #DNNs making mistakes when presented with #something #unexpected by giving them access to #all the #data they might ever encounter and enough #space to store the learned #parameters.
For #neuralnetwork #training, the next frontier may lie in #rethinking the #backpropagation #algorithm that gave #deeplearning its new life in the 2010s. One major puzzle for AI specialists is why backpropagation has proven so successful at producing results, given that biological neurons simply are not connected in a way that would make it possible. Work on #alternative #architectures such as #spiking #neural #networks, as well as #hybrids that break #backpropagation into more manageable chunks, may deliver more responsive and efficient #neural #networks.
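To make concrete what is being rethought, here is a hedged sketch of vanilla backpropagation on a tiny two-layer network: the backward pass transports a global error signal layer by layer via the chain rule, the step with no obvious biological counterpart. The network shape, squared-error loss, and learning rate are illustrative assumptions.

```python
# Toy two-layer network trained with hand-written backpropagation.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 3))            # 4 samples, 3 features
y = rng.normal(size=(4, 1))            # regression targets
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

for step in range(100):
    # Forward pass
    h = np.tanh(x @ W1)                # hidden activations
    pred = h @ W2
    loss = ((pred - y) ** 2).mean()

    # Backward pass: chain rule, propagated layer by layer from the output
    g_pred = 2 * (pred - y) / len(y)   # dLoss/dpred
    g_W2 = h.T @ g_pred
    g_h = g_pred @ W2.T
    g_W1 = x.T @ (g_h * (1 - h ** 2))  # tanh'(z) = 1 - tanh(z)^2

    W1 -= 0.1 * g_W1                   # gradient-descent updates
    W2 -= 0.1 * g_W2

print(f"final loss: {loss:.4f}")
```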
#Work expands so as to fill the [computing] #time [& #space] available for its completion. – #Parkinson’s #law [corollary]
VentureBeat State of AI Report tracks #transformers in #critical #infrastructure: https://lnkd.in/eNsUnYnX :
We see the #attention-based #transformer #architecture for #machinelearning #models branch out from #natural #language #processing to #computer #vision applications. Google made that come true with its #vision #transformer, known as #ViT. The approach has also shown success with #audio and #3D point cloud models and shows potential to grow as a general-purpose #modeling tool. Transformers have also demonstrated superior performance at predicting chemical reactions, and the UK's National Grid utility halved the #error in its #forecasts for #electricity #demand using a type of transformer.
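A rough sketch of the patch-embedding idea behind #ViT: the image is cut into fixed-size patches, each patch is flattened and linearly projected into a token, and the resulting sequence can be fed to a standard Transformer encoder. The patch size, embedding width, and function name image_to_patch_tokens are illustrative assumptions, not taken from the report.

```python
# ViT-style patch embedding: turn an image into a sequence of token vectors.
import numpy as np

def image_to_patch_tokens(img, patch=8, d_model=64):
    """img: (H, W, C) array; returns (num_patches, d_model) token matrix."""
    H, W, C = img.shape
    rng = np.random.default_rng(0)
    proj = rng.normal(size=(patch * patch * C, d_model)) * 0.02  # learned in practice
    tokens = []
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            tokens.append(img[i:i + patch, j:j + patch].reshape(-1))  # flatten patch
    return np.stack(tokens) @ proj     # linear projection to embeddings

img = np.random.rand(32, 32, 3)        # toy 32x32 RGB image
print(image_to_patch_tokens(img).shape)  # (16, 64): a 4x4 grid of 8x8 patches
```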
2021 #State of #AI #Report: Download: https://www.stateof.ai/ :
Global Risk Management Network LLC: Silicon Valley-Wall Street-Pentagon-Global Digital CEOs Networks
CEOs-CxOs: We create the Digital Future™: https://lnkd.in/esk8PEp
You Can Too!: Start with A, B, C:
AIMLExchange.com : BRINT.com : C4I-Cyber.com :
https://lnkd.in/eUb6Z-ES :
You May Have All the Solutions, We Have All Your Customers™: