Research/Blog
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #WhereLearningNeverStops Last Saturday, AI Researcher Indrajit Singh presented a marvellous workshop on “Bidirectional Encoder Representations from Transformers”, or “BERT” for short. We are in the era of pre-trained models in AI. The figure below shows the many pre-trained models which have taken root. BERT is one of the core models […]
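A minimal sketch of working with a pre-trained BERT model. It assumes the Hugging Face transformers library and PyTorch, which the post itself does not prescribe; the checkpoint name and sentence are illustrative only.

```python
# Load a pre-trained BERT checkpoint and encode one sentence into contextual embeddings.
# Assumes: pip install transformers torch (not specified in the original post).
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT produces contextual word embeddings.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, 768) for bert-base
```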
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #WhereLearningNeverStops Recently, our AI Researcher Gouthaman Asokan presented a superb introductory session on Graph Neural Networks (GNNs). Structured vs Unstructured Data :- Images and text are structured data. Convolutional Neural Networks, Recurrent Neural Networks and Autoencoders work well on structured data, as it can be converted to a matrix- or vector-like format. […]
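A minimal sketch of the message-passing idea behind GNNs, using plain NumPy. The 3-node graph, features and weights are made up for illustration and are not from the session.

```python
# One graph-convolution style update: aggregate neighbour features, transform, apply ReLU.
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)     # adjacency matrix of a toy 3-node graph
X = np.random.rand(3, 4)                   # node features (3 nodes, 4 features each)
W = np.random.rand(4, 2)                   # learnable weight matrix (4 -> 2)

A_hat = A + np.eye(3)                      # add self-loops so a node keeps its own features
D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # normalise by node degree
H = np.maximum(0, D_inv @ A_hat @ X @ W)   # aggregate, transform, ReLU
print(H.shape)                             # (3, 2) updated node embeddings
```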
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #WhereLearningNeverStops Last Saturday, our AI Lab Researcher Indrajit Singh presented a comprehensive webinar on Transformers, which are used in NLP. Introduction :- To combine the advantages of both CNNs and RNNs, [Vaswani et al., 2017] designed a novel architecture using the attention mechanism. This architecture, called the Transformer, achieves parallelization by […]
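A minimal sketch of the scaled dot-product attention at the heart of the Transformer [Vaswani et al., 2017], written in NumPy; the toy shapes are placeholders, not values from the webinar.

```python
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                                        # weighted sum of values

Q = np.random.rand(5, 8)   # 5 positions, d_k = 8
K = np.random.rand(5, 8)
V = np.random.rand(5, 8)
print(scaled_dot_product_attention(Q, K, V).shape)            # (5, 8)
```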
Last Saturday 9th May ’20, our AI Lab researchers presented some superb sessions. Credit Risk Modelling with ML :- First, RameshBabu L. presented an excellent session on Credit Risk Modelling. Credit risk modelling refers to data-driven risk models which calculate the chances of a borrower defaulting on a loan (or credit card). The logistic regression method […]
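A minimal sketch of a logistic-regression default model of the kind described in the session, using scikit-learn. The synthetic data and feature interpretation are purely illustrative.

```python
# Fit a logistic regression classifier to predict probability of default on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))              # e.g. standardised income, utilisation, age (made up)
y = (X[:, 1] + 0.5 * rng.normal(size=1000) > 1).astype(int)   # 1 = default, 0 = repaid

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(clf.predict_proba(X_test)[:5, 1])     # estimated probability of default per borrower
```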
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #AlwaysUpskilling Last Friday, our AI Lab Researcher Darshan C G presented a wonderful overview of “GPU Architectures and General-Purpose GPU Computing”. Parallelism :- Performance improvements are often based on parallelism techniques, which are found everywhere :- pipelining, instruction-level parallelism, vector processing, array processors/MPP, multiprocessor systems, multicomputers/cluster computing, multicores, and Graphics Processing Units (GPUs) […]
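A minimal sketch of general-purpose GPU computing from Python. It assumes the CuPy library and a CUDA-capable GPU, neither of which the talk prescribes; the point is only that an element-wise operation is executed in parallel on the device.

```python
# Move an array to GPU memory, run an element-wise op in parallel, copy the result back.
import numpy as np
import cupy as cp   # assumption: CuPy installed and a CUDA GPU available

x_cpu = np.arange(1_000_000, dtype=np.float32)
x_gpu = cp.asarray(x_cpu)        # host -> device copy

y_gpu = cp.sqrt(x_gpu) * 2.0     # every element processed in parallel on the GPU
y_cpu = cp.asnumpy(y_gpu)        # device -> host copy
print(y_cpu[:5])
```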
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #AlwaysUpskilling Last Saturday 2 May 2020, our AI Lab Researcher Indrajit Singh presented a fabulous session on “Sequence to Sequence Learning and Attention in Neural Networks”. Sequence to Sequence Model :- Sequence-to-sequence learning (Seq2Seq) is about training models to convert sequences from one domain (e.g. sentences in English) to sequences in another […]
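A minimal sketch of an LSTM encoder-decoder (Seq2Seq) model in tf.keras; the vocabulary sizes and hidden dimension are placeholders, not values from the session.

```python
# Encoder reads the source sequence; its final states initialise the decoder,
# which generates the target sequence token by token.
import tensorflow as tf

src_vocab, tgt_vocab, latent = 5000, 5000, 256

enc_in = tf.keras.Input(shape=(None,))
enc_emb = tf.keras.layers.Embedding(src_vocab, latent)(enc_in)
_, h, c = tf.keras.layers.LSTM(latent, return_state=True)(enc_emb)

dec_in = tf.keras.Input(shape=(None,))
dec_emb = tf.keras.layers.Embedding(tgt_vocab, latent)(dec_in)
dec_out, _, _ = tf.keras.layers.LSTM(latent, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[h, c])
logits = tf.keras.layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = tf.keras.Model([enc_in, dec_in], logits)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```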
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars #AlwaysUpskilling Last Saturday (25th Apr ’20), our AI Lab Researcher Darshan G. presented a fabulous hands-on workshop on Tensor Fusion, Accelerated Linear Algebra (XLA) and many other techniques to speed up Deep Learning computations. The following discussion covers many of these techniques. TensorFlow Graph Concepts :- TensorFlow (v1.x) programs generate a DataFlow […]
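The workshop describes TensorFlow 1.x dataflow graphs; a minimal TensorFlow 2.x sketch of the same ideas (tracing a Python function into a graph and asking XLA to compile and fuse its ops) might look like this, assuming a TensorFlow version where jit_compile is available.

```python
# tf.function traces this Python function into a dataflow graph;
# jit_compile=True requests XLA compilation, which can fuse matmul + add + relu.
import tensorflow as tf

@tf.function(jit_compile=True)
def fused_op(x, w, b):
    return tf.nn.relu(tf.matmul(x, w) + b)

x = tf.random.normal((64, 128))
w = tf.random.normal((128, 32))
b = tf.zeros((32,))
print(fused_op(x, w, b).shape)   # (64, 32)
```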
This post discusses temporal difference (TD) methods used in Reinforcement Learning. It contrasts TD methods with Monte Carlo (MC) methods and dynamic programming. You need to have a thorough understanding of the Markov Decision Process (MDP) to understand this post. Prediction and Control : In general, RL methods have two components : 1) Prediction / Evaluation : where […]
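A minimal sketch of the TD(0) prediction update, V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)), applied to a made-up episode of (state, reward, next_state) transitions; the step size, discount and transitions are illustrative only.

```python
# TD(0) prediction: bootstrap the value of s from the current estimate of s'.
from collections import defaultdict

alpha, gamma = 0.1, 0.99
V = defaultdict(float)                       # state-value estimates, default 0

episode = [("s0", 0.0, "s1"), ("s1", 0.0, "s2"), ("s2", 1.0, "terminal")]
for s, r, s_next in episode:
    td_target = r + gamma * V[s_next]        # one-step bootstrapped target
    V[s] += alpha * (td_target - V[s])       # move V(s) toward the TD target
print(dict(V))
```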
This post assumes that you have a strong understanding of the basics of Reinforcement Learning, MDP, DQN and Policy Gradient Algorithms. You can go through Policy Gradients to understand the derivation for Stochastic Policies. In the previous post on Actor Critic, we saw the advantage of merging Value-based and Policy-based methods together. The […]
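A minimal sketch of the Actor-Critic idea referred to above, assuming PyTorch and using made-up scalar values: the critic estimates V(s), and the actor's policy gradient is weighted by the one-step advantage A = r + gamma * V(s') - V(s).

```python
# Combine a policy-based actor loss with a value-based critic loss for a single transition.
import torch

gamma = 0.99
log_prob_a = torch.tensor(-0.7, requires_grad=True)   # log pi(a|s) from the actor
v_s = torch.tensor(0.4, requires_grad=True)           # critic's estimate V(s)
v_s_next = torch.tensor(0.5)                          # critic's estimate V(s'), treated as fixed
reward = 1.0

advantage = reward + gamma * v_s_next - v_s
actor_loss = -log_prob_a * advantage.detach()         # policy gradient weighted by the advantage
critic_loss = advantage.pow(2)                        # push V(s) toward the TD target
(actor_loss + critic_loss).backward()
print(advantage.item(), log_prob_a.grad.item(), v_s.grad.item())
```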
#CellStratAILab #disrupt4.0 #WeCreateAISuperstars Last Saturday, AI Researcher Indrajit Singh presented a superb workshop on Dependency Parsing used in NLP. The topics covered in this workshop included :- understanding Dependency Parsing; Syntactic Structure: Constituency and Dependency; Dependency Grammar and Treebanks; and Transition-based dependency parsing. Dependency Parsing involves detecting which words depend on which other words. Dependencies are […]
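A minimal sketch of inspecting the dependencies a parser produces, assuming spaCy and its small English pipeline (neither of which the workshop prescribes); each token is linked to its syntactic head by a typed dependency relation.

```python
# Print (word, dependency label, head word) for every token in a sentence.
import spacy

nlp = spacy.load("en_core_web_sm")   # assumption: pre-trained English pipeline installed
doc = nlp("Dependency parsing detects which words depend on which other words.")

for token in doc:
    print(f"{token.text:<12} --{token.dep_:>10}--> {token.head.text}")
```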