Advanced NLP and Temporal Sequence Processing (20 – 21 February 2020)

SGInnovate and Red Dragon AI

Thursday, February 20, 2020 at 9:00 AM - Friday, February 21, 2020 at 5:00 PM (Singapore Standard Time)

Ticket Information

Ticket Type: Early Bird Module 3 [Ends on 19 January 2020] (ticket inclusive of GST)
Sales End: Jan 19, 2020
Price: $1,524.75
Fee: $0.00

Ticket price is inclusive of GST. If you are organisation-sponsored and require a tax invoice, please contact learning@sginnovate.com instead of checking out via PayPal. Funding is on a reimbursement basis upon fulfilment of the funding criteria.


Event Details

Overview

 

Together with Red Dragon AI, SGInnovate is pleased to present the third module of the Deep Learning Developer Series. In this module, we dive deeper into some of the latest techniques for using Deep Learning for text and time series applications.

 

In this course, participants will learn: 

  • Text classification models and how to build a text classifier
  • How to build a Named Entity Recogniser (NER) system
  • Sequence-to-sequence models
  • How to build NLP models from scratch
  • How to build a Chatbot Machine Learning system
  • How to build a language model

Recommended Prerequisites:

About this module:

 

One of the core skills in Natural Language Processing (NLP) is reliably detecting entities and classifying individual words according to their parts of speech. We will look at how Named Entity Recognition (NER) works and how Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks are used for this and many other NLP tasks.
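
As a rough illustration only (not course material), here is a minimal sketch of the kind of bidirectional-LSTM sequence tagger used for NER; the vocabulary size and tag count below are placeholder assumptions, and the example assumes TensorFlow/Keras.

    import tensorflow as tf

    VOCAB_SIZE = 10000   # placeholder: number of distinct word indices
    NUM_TAGS = 9         # placeholder: e.g. a BIO-style entity tag set

    # One tag prediction per token: embed words, run a bidirectional LSTM,
    # then classify each timestep independently.
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, 64, mask_zero=True),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
        tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(NUM_TAGS, activation="softmax")),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])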

 

Another common Deep Learning technique in NLP is the use of word and character vector embeddings. We will cover well-known models such as Word2Vec and GloVe, how they are created, their unique properties, and how you can use them to improve accuracy on natural language understanding problems and applications.
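
As an illustration of the kind of properties these embeddings have, here is a small sketch (assuming NumPy and a standard GloVe text file such as glove.6B.100d.txt, which is a placeholder, not a file supplied by the course) that loads vectors and checks the classic king - man + woman ≈ queen analogy.

    import numpy as np

    def load_glove(path):
        """Read a GloVe text file ('word v1 v2 ...') into a dict of vectors."""
        vectors = {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                parts = line.rstrip().split(" ")
                vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
        return vectors

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    emb = load_glove("glove.6B.100d.txt")   # placeholder path
    print(cosine(emb["king"] - emb["man"] + emb["woman"], emb["queen"]))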

 

We will also cover some of the recent developments in using transfer learning for text-related problems and language modelling, which have produced some of the current state-of-the-art results on text classification problems such as sentiment analysis. This section will cover ULMFiT, ELMo and OpenAI's recent Transformer model.
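
The general pattern behind these approaches is to reuse an encoder trained on a large corpus and fine-tune it on a small labelled set. The sketch below shows a generic freeze-then-unfreeze recipe in Keras, not the exact ULMFiT procedure; the encoder here is an untrained stand-in for a pre-trained language model.

    import tensorflow as tf

    # Hypothetical stand-in for a pre-trained language-model encoder.
    encoder = tf.keras.Sequential([
        tf.keras.layers.Embedding(20000, 128, mask_zero=True),
        tf.keras.layers.LSTM(128),
    ])
    encoder.trainable = False        # stage 1: train only the new classifier head

    classifier = tf.keras.Sequential([
        encoder,
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # e.g. sentiment: positive vs negative
    ])
    classifier.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # ... fit on labelled data, then unfreeze and fine-tune with a small learning rate:
    encoder.trainable = True
    classifier.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
                       loss="binary_crossentropy", metrics=["accuracy"])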

 

One of the most significant applications of NLP currently is the creation of chatbots and dialogue systems. Thus, in this module, you will discover how various types of chatbots work, the key technologies behind them and systems like Google’s DialogFlow and Duplex.

 

We will also look at applications such as Neural Machine Translation. You will learn about recent developments and models that use these techniques, as well as the various types of attention mechanisms that dramatically improve the quality of translation systems.
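
As a concrete example of the core computation, here is one common form of attention (scaled dot-product, shown in plain NumPy); the shapes are toy values chosen for illustration.

    import numpy as np

    def attention(queries, keys, values):
        """Each query attends over all keys; output is the weighted sum of values."""
        d_k = queries.shape[-1]
        scores = queries @ keys.T / np.sqrt(d_k)                  # query-key similarity
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
        return weights @ values

    q = np.random.randn(3, 8)   # e.g. 3 decoder positions
    k = np.random.randn(5, 8)   # e.g. 5 encoder states
    v = np.random.randn(5, 8)
    print(attention(q, k, v).shape)   # (3, 8)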

 

Beyond text, this module also covers time-series prediction and how you can apply techniques from text-based models to make predictions on sequences. This widens the range of applications to include financial time series, continuous IoT readings, machinery failure prediction, website optimisation and trip planning.
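
A typical first step for this is turning a continuous series into supervised (window, next value) pairs. A minimal sketch, assuming NumPy and a simple one-step-ahead setup:

    import numpy as np

    def make_windows(series, window, horizon=1):
        """Slice a 1-D series into (input window, target value) training pairs."""
        X, y = [], []
        for start in range(len(series) - window - horizon + 1):
            X.append(series[start:start + window])
            y.append(series[start + window + horizon - 1])
        return np.asarray(X), np.asarray(y)

    series = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # toy readings
    X, y = make_windows(series, window=4)
    print(X.shape, y.shape)   # (3, 4) (3,)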

 

As with all the other Deep Learning Developer modules, you will have the opportunity to build multiple models yourself. 

 

About the Deep Learning Developer Series:

 

The Deep Learning Developer Series is a hands-on and cutting-edge series targeted at developers and data scientists who are looking to build Artificial Intelligence applications for real-world usage. It is an expanded curriculum that breaks away from the regular eight-week long full-time course structure and allows for modular customisation according to your own pace and preference. In every module, you will have the opportunity to build your Deep Learning models as part of your main project. You will also be challenged to use your new skills in your field of work or interest.

 

This workshop is eligible for funding support. For more details, please refer to the "Pricing" section below.

 

Agenda

 

Day 1 (20 February 2020)

 

8:45am – 9:00am: Registration
9:00am – 10:45am: Recurrent Neural Networks (RNNs) Recap Part 1

  • RNNs
  • Long Short-Term Memory (LSTMs)
  • Word embeddings: Word2Vec, GloVe
  • Basic Char RNNs
  • Word RNNs
  • Build LSTM networks

10:45am – 11:00am: Tea Break
11:00am – 12:30pm: RNNs Recap Part 2
12:30pm – 1:30pm: Lunch
1:30pm – 3:00pm: Natural Language Processing (NLP) Part 1

  • Text classification models
  • Bidirectional LSTMs
  • Building a Named Entity Recogniser (NER) system
  • Sentiment analysis
  • Building a text classifier
  • Personal text project
  • Major Project Week 1

3:00pm – 3:15pm: Tea Break
3:15pm – 4:45pm: NLP Part 2
4:45pm – 5:15pm: Personal text project

  • Ideas on projects to do
  • Q&A on ‘doable projects’
  • Homework: What to bring for the next session

5:15pm – 5:45pm: Closing comments and questions

 

Day 2 (21 February 2020)

 

8:45am – 9:00am: Registration
9:00am – 10:45am: Seq2Seq and CNN for Text

  • Sequence to sequence models
  • Convolutions for text networks
  • Clustering
  • Seq2Seq Chatbot

10:45am – 11:00am: Tea Break
11:00am – 12:30pm: Project Clinic
Project questions and general follow up

12:30pm – 1:30pm: Lunch
1:30pm – 3:15pm: Time Series

  • Univariate vs Multivariate
  • Stationarity and Trends
  • Windowing and Differencing
  • ARIMA / SARIMA
  • LSTM for Time Series
  • ConvLSTM for Time Series

3:15pm – 3:30pm: Tea Break

3:30pm – 4:30pm: The Rise of the Language Models

4:30pm – 5:15pm: Closing comments and questions

 

Participants will be given two weeks to complete their online learning and individual project. 

 

Online Learning 

  • Building NLP models from scratch
  • NLP pipelines
  • Guide to using spaCy
  • Building a Chatbot Machine Learning system
  • Building a language model

Participants must fulfil the criteria stated below to pass and complete the course.

 

1.    Online Tests: Participants are required to score above 75% 

2.    Project: Participants are required to present and achieve a pass on a project that demonstrates the following (a minimal end-to-end sketch follows this list):

  • The ability to use or create a data processing pipeline that gets data in the correct format for running in a Deep Learning model
  • The ability to create a model from scratch or use transfer learning to create a Deep Learning model
  • The ability to train your model and get results
  • The ability to evaluate your model on held-out data
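
As an illustration only, here is a minimal end-to-end sketch of the kind of project pipeline described above (toy data, a tiny model built from scratch, training, and evaluation on held-out examples); it assumes TensorFlow/Keras and is not a template provided by the course.

    import numpy as np
    import tensorflow as tf

    # Toy labelled data -- placeholders, not course material.
    texts = np.array(["great course", "really useful", "too slow", "not worth it"])
    labels = np.array([1, 1, 0, 0])

    # 1. Data pipeline: raw text -> padded integer sequences.
    vectorise = tf.keras.layers.TextVectorization(max_tokens=1000, output_sequence_length=8)
    vectorise.adapt(texts)
    X = vectorise(texts)

    # 2. A small model built from scratch.
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(1000, 16, mask_zero=True),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # 3. Train, then 4. evaluate on held-out data (a trivial split, for illustration).
    model.fit(X[:3], labels[:3], epochs=5, verbose=0)
    print(model.evaluate(X[3:], labels[3:], verbose=0))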

Pricing

 

Funding Support

This workshop is eligible for CITREP+ funding.

 

CITREP+ is a programme under the TechSkills Accelerator (TeSA) – an initiative of SkillsFuture, driven by the Infocomm Media Development Authority (IMDA).

 


*Please see the section below on ‘Guide for CITREP+ funding eligibility and self-application process’

 

Funding Amount: 

  • CITREP+ covers up to 90% of the nett payable course fee for eligible professionals

Please note: funding is capped at $3,000 per course application

  • CITREP+ covers up to 100% of the nett payable course fee for eligible students / full-time National Servicemen (NSFs)

Please note: funding is capped at $2,500 per course application

Funding Eligibility: 

  • Singaporean / PR
  • Meets course admission criteria
  • Sponsoring organisation must be registered or incorporated in Singapore (only for individuals sponsored by organisations)

Please note: 

  • Employees of local government agencies and Institutes of Higher Learning (IHLs) will qualify for CITREP+ under the self-sponsored category
  • Sponsoring SME organisations that wish to apply for up to 90% funding support for the course must meet the SME status as defined here

Claim Conditions: 

  • Meet the minimum attendance (75%)
  • Complete and pass all assessments and / or projects

Guide for CITREP+ funding eligibility and self-application process:

For more information on CITREP+ eligibility criteria and application procedure, please click here

 


 

For enquiries, please send an email to learning@sginnovate.com

 

Trainers

 

 

Dr Martin Andrews
Martin has over 20 years' experience in Machine Learning, which he has used to solve problems in financial modelling and to create AI automation for companies. His current focus and speciality is natural language processing and understanding. In 2017, Google appointed Martin as one of the first 12 Google Developer Experts for Machine Learning. Martin is also one of the co-founders of Red Dragon AI.

 

 

Sam Witteveen
Sam has used Machine Learning and Deep Learning in building multiple tech start-ups, including a children's educational app provider with over 4 million users worldwide. His current focus is AI for conversational agents that allow humans to interact with computers more easily and quickly. In 2017, Google appointed Sam as one of the first 12 Google Developer Experts for Machine Learning in the world. Sam is also one of the co-founders of Red Dragon AI.


When & Where


TBC


Singapore

Thursday, February 20, 2020 at 9:00 AM - Friday, February 21, 2020 at 5:00 PM (Singapore Standard Time)

