Graph Neural Networks, Memory Networks
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #AlwaysUpskilling #CellStratPrime
Minutes from Saturday 22nd Feb 2020 AI Lab meetup at BLR :-
Last Saturday our AI Researchers presented some amazing algorithms in the AI Lab.
Graph Neural Networks :-
First Pushparaj M. presented an excellent seminar on Graph Neural Networks (GNN).
A Graph is a set of nodes (vertices) connected by directed or undirected connections (edges). Examples include a friend network on Facebook or LinkedIn, the World Wide Web, or even cities in a transportation system. Graphs represent real-world connections between entities.
The question then becomes: can we find useful insights in graphs? Indeed we can, using ML or DL algorithms.
However, to apply ML to a graph, we must first transform it into a suitable data structure: an Adjacency Matrix or an Adjacency List.
In the first case, an N x N matrix records a 1 (or an edge weight) at position (i, j) whenever node i is connected to node j.
In the second case, each node is mapped to a list of its directly connected neighbours.
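The two representations above can be sketched in a few lines of plain Python (the toy edge list and node count are illustrative, not from the talk):

```python
# Toy undirected graph: 4 nodes, edges given as pairs
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency matrix: n x n grid, entry [i][j] = 1 if an edge connects i and j
adj_matrix = [[0] * n for _ in range(n)]
for i, j in edges:
    adj_matrix[i][j] = 1
    adj_matrix[j][i] = 1  # symmetric because the graph is undirected

# Adjacency list: each node maps to the list of its neighbours
adj_list = {v: [] for v in range(n)}
for i, j in edges:
    adj_list[i].append(j)
    adj_list[j].append(i)

print(adj_matrix[0])  # [0, 1, 1, 0]
print(adj_list[2])    # [0, 1, 3]
```

The matrix is convenient as a fixed-size input for a neural network; the list is more memory-efficient for sparse graphs.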
This adjacency representation is then fed to a deep neural network and can be used for tasks such as classification or regression. Classification tasks might include determining target customers for a firm by analyzing a social network, or even predicting which people will vote for certain political parties.
Pushparaj demonstrated graph neural networks with a code demo involving creation of an Adjacency Matrix and then feeding it to a DNN for classification.
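As a rough sketch of how an adjacency matrix feeds into a neural layer, here is one graph-convolution-style propagation step in NumPy. This assumes a GCN-like update H' = relu(A_norm @ H @ W); the graph, feature sizes, and random weights are illustrative, not from the actual demo:

```python
import numpy as np

np.random.seed(0)

# Adjacency matrix of a small 4-node undirected graph (illustrative)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_hat = A + np.eye(4)                     # add self-loops so each node keeps its own features
D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # inverse degree matrix
A_norm = D_inv @ A_hat                    # row-normalised propagation matrix

H = np.random.randn(4, 3)  # node feature matrix: 4 nodes, 3 features each
W = np.random.randn(3, 2)  # weight matrix (random here; learned in practice)

# One layer: average each node with its neighbours, transform, apply ReLU
H_next = np.maximum(0, A_norm @ H @ W)
print(H_next.shape)  # (4, 2)
```

Stacking such layers and ending with a softmax over node classes gives a simple node-classification network of the kind described above.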
Memory Networks :-
Later Sujith Kamath presented a superb session on Memory Networks. Here is a synopsis on this topic :
NLP is evolving very fast and is being applied to many business challenges. Memory Networks are a recent addition to the DL toolkit for tackling problems with text data.
Traditional sequential neural networks such as RNNs and Bi-LSTMs work satisfactorily only when the input spans a few sentences; they cannot retain the context when the input is huge, with articles running into many pages. Memory Networks maintain long-term memory along with the context better than these techniques, particularly when the task is to output a single-word answer.
Memory Networks work by learning from Question-Answer pairs provided in a Supervised manner. They achieve remarkable accuracy in predicting short answers for long-form text.
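The core mechanism can be sketched as an attention step over memory slots: the embedded question is matched against each embedded story sentence, and a softmax-weighted sum of memories forms the response used to predict the answer. The sizes and random embeddings below are illustrative, assuming a single-hop end-to-end memory network:

```python
import numpy as np

np.random.seed(1)

d = 8                               # embedding dimension (illustrative)
memories = np.random.randn(5, d)    # 5 embedded story sentences (the "memory")
question = np.random.randn(d)       # embedded question

# Match the question against each memory slot: dot product + softmax
scores = memories @ question
p = np.exp(scores - scores.max())   # subtract max for numerical stability
p /= p.sum()                        # attention weights over the 5 memories

# Weighted sum of memories forms the response vector
o = p @ memories

# A single-word answer would then be predicted from (o + question),
# e.g. via a softmax over the vocabulary (final projection omitted here).
print(round(p.sum(), 6))  # 1.0 — attention weights form a distribution
print(o.shape)            # (8,)
```

Training on Question-Answer pairs adjusts the embeddings so the attention lands on the sentences relevant to each question.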
CellStrat AI Lab continues to blaze the trail for advanced AI training and research in India. Visit our AI Lab in BLR or GGN tomorrow (29th Feb 2020) to explore our AI Lab and world-class training programs :-
BLR AI Lab :-
Register : https://bit.ly/32K48dH
Topic : Quantum Hardware Architecture, Natural Language Generation (NLG)
Date : Saturday 29th Feb 2020, 10:30 AM – 5:00 PM
Presenters : Niraj Kale, Mohana Murali Gurunathan, Indrajit Singh
Gurugram AI Lab :-
Register : https://bit.ly/3aiBvH3
Topic : ML in E-commerce, ML deployment on AWS
Date : Saturday 29th Feb 2020, 10:00 AM – 2:00 PM
Presenters : Nazish Ansari, Abhishek Kumar, Bhavesh Laddagiri
See you tomorrow for the AI Lab meetup in BLR or GGN! Let's disrupt the world with AI!
Questions ? Call me at 9742800566 !
Co-Founder & Chief Data Scientist, CellStrat