Blog of Qing



Bilinear Interpolation

Posted on 2020-08-22 | In Math

Introduction to Bilinear Interpolation.

Read more »

Meta-Learning

Posted on 2020-08-01 | In Meta Learning

This post introduces a new field of machine learning, meta-learning, whose aim is to learn the ability to learn. For example, if we train a model on multiple learning tasks, such as speech recognition and image classification, then when it faces a new task like text classification, it will learn faster and better, since it has already seen many similar recognition tasks. In a nutshell, meta-learning algorithms produce a base network that is specifically designed for quick adaptation to new tasks using few-shot data.
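To make the quick-adaptation idea concrete, here is a toy, first-order MAML-style sketch (the scalar task family, variable names, and learning rates are all illustrative assumptions, not from the post): each task asks a single parameter to match a different target, the inner loop takes one gradient step per task, and the outer loop updates the shared initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, a):
    """Task: fit scalar target a with parameter w under squared loss.
    Returns (loss, gradient)."""
    return (w - a) ** 2, 2 * (w - a)

w = 0.0                    # meta-parameter (the shared "base network")
alpha, beta = 0.1, 0.01    # inner / outer learning rates (made up)

for step in range(1000):
    meta_grad = 0.0
    for _ in range(5):                       # sample a small batch of tasks
        a = rng.normal()                     # each task: a different target
        _, g = loss_grad(w, a)
        w_task = w - alpha * g               # inner loop: one adaptation step
        _, g_adapted = loss_grad(w_task, a)  # evaluate after adaptation
        # First-order MAML: reuse the adapted gradient for the meta-update.
        meta_grad += g_adapted
    w -= beta * meta_grad / 5                # outer loop: update base params
```

Because the task targets average to zero, the meta-parameter settles near the initialization from which one inner-loop step adapts best on average.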

Read more »

DP-Graph Convolutional Networks

Posted on 2020-05-24 | In Deep Learning

Graph Convolutional Networks (GCNs) are a type of neural network designed to work directly on graphs and leverage their structural information. In this post, I will introduce how information is propagated through the hidden layers of a GCN.
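As a preview of that propagation rule, here is a minimal NumPy sketch of a single GCN layer, H' = ReLU(D^{-1/2} Â D^{-1/2} H W), on a toy three-node graph (the graph, feature size, and weight size are made up for illustration):

```python
import numpy as np

# Toy graph: 3 nodes, edges 0-1 and 1-2 (adjacency matrix A).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# Add self-loops so each node keeps its own features.
A_hat = A + np.eye(3)

# Symmetric normalization: D^{-1/2} * A_hat * D^{-1/2}.
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

H = np.eye(3)                      # initial node features (one-hot)
W = np.random.randn(3, 2)          # layer weights (hypothetical size)

# One GCN layer: aggregate normalized neighbor features, then apply ReLU.
H_next = np.maximum(0, A_norm @ H @ W)
print(H_next.shape)  # (3, 2)
```

Stacking such layers lets each node's representation mix in information from neighbors further and further away.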

Read more »

LeetCode

Posted on 2020-02-09 | In Algorithm

Input Size vs. Time Complexity

  1. 279 - Perfect Squares
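For 279 (Perfect Squares), a standard O(n√n) dynamic-programming solution looks like the sketch below (the function name is mine, not from the post): dp[i] holds the fewest perfect squares summing to i.

```python
def num_squares(n: int) -> int:
    """LeetCode 279: fewest perfect squares that sum to n."""
    dp = [0] + [float("inf")] * n
    for i in range(1, n + 1):
        j = 1
        while j * j <= i:  # try every square not exceeding i
            dp[i] = min(dp[i], dp[i - j * j] + 1)
            j += 1
    return dp[n]

print(num_squares(12))  # 4 + 4 + 4 -> 3
print(num_squares(13))  # 4 + 9 -> 2
```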
Read more »

ML-Knowledge

Posted on 2019-12-13 | In Machine Learning

All kinds of Machine learning methods.

Read more »

Math-Linear Algebra

Posted on 2019-12-11 | In Math

These are my study notes for MIT Linear Algebra.

Read more »

NLP-TorchText

Posted on 2019-07-16 | In NLP

If you’ve ever worked on a deep learning project for NLP, you’ll know how painful and tedious all the preprocessing is. Before you start training your model, you have to:

  1. Read the data from disk
  2. Tokenize the text
  3. Create a mapping from word to a unique integer
  4. Convert the text into lists of integers
  5. Load the data in whatever format your deep learning framework requires
  6. Pad the text so that all the sequences are the same length, so you can process them in batch

Torchtext is a library that makes all the above processing much easier.
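To see what Torchtext saves you from, here is a plain-Python sketch of steps 2–4 and 6 above (the toy sentences and variable names are illustrative; step 5 is framework-specific and omitted):

```python
# Step 1: pretend the data has already been read from disk.
texts = ["the cat sat", "a dog barked loudly"]

# Step 2: tokenize.
tokenized = [t.split() for t in texts]

# Step 3: map each word to a unique integer (0 reserved for padding).
vocab = {"<pad>": 0}
for sent in tokenized:
    for word in sent:
        vocab.setdefault(word, len(vocab))

# Step 4: convert the text into lists of integers.
ids = [[vocab[w] for w in sent] for sent in tokenized]

# Step 6: pad so all sequences in the batch share the max length.
max_len = max(len(s) for s in ids)
batch = [s + [vocab["<pad>"]] * (max_len - len(s)) for s in ids]
print(batch)  # [[1, 2, 3, 0], [4, 5, 6, 7]]
```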

Read more »

DP-Classical CNNs

Posted on 2019-07-07 | In Deep Learning

In this post, we’ll summarize many of the new and important developments in the field of computer vision and convolutional neural networks.

The 9 Deep Learning Papers You Need To Know About (Understanding CNNs Part 3)

Read more »

NLP-Seq2Seq

Posted on 2019-07-01 | In NLP

There’s a whole class of NLP tasks that rely on sequential output, or outputs that are sequences of potentially varying length. For example,

  • Translation: taking a sentence in one language as input and outputting the same sentence in another language.
  • Conversation: taking a statement or question as input and responding to it.
  • Summarization: taking a large body of text as input and outputting a summary of it.

This post therefore introduces sequence-to-sequence models, a deep learning-based framework for handling these kinds of tasks.

Read more »

NLP-Dependency Parser

Posted on 2019-06-12 | In NLP

Dependency structure of sentences shows which words depend on (modify or are arguments of) which other words. These binary asymmetric relations between the words are called dependencies and are depicted as arrows going from the head (or governor, superior, regent) to the dependent (or modifier, inferior, subordinate).
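A small illustration, with the example sentence, arc labels, and data layout chosen by me (Universal Dependencies-style relation names, not from the post): each arc stores (head, dependent, relation), and in a well-formed dependency tree every word except the pseudo-root has exactly one head.

```python
# Toy dependency structure for "She gave me a raise".
ROOT = 0  # conventional pseudo-word at index 0
words = ["<ROOT>", "She", "gave", "me", "a", "raise"]
arcs = [
    (2, 1, "nsubj"),    # gave  -> She
    (ROOT, 2, "root"),  # <ROOT> -> gave
    (2, 3, "iobj"),     # gave  -> me
    (5, 4, "det"),      # raise -> a
    (2, 5, "dobj"),     # gave  -> raise
]

# Tree check: every word except <ROOT> has exactly one head.
heads = {dep: head for head, dep, _ in arcs}
assert sorted(heads) == list(range(1, len(words)))

for head, dep, rel in arcs:
    print(f"{words[head]} --{rel}--> {words[dep]}")
```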

Read more »

© 2021 Qing Wong