Hello World. My name is Hao Tan (谭昊).

My research topic is natural language processing, and my advisor is Mohit Bansal.

I received my BS degree from Shanghai Jiao Tong University, where I was a member of the ACM Honored Class.

I have been a PhD student in the UNC CS department since 2016.

Email: airsplay@cs.unc.edu


Tricks for training RNNs (NNs):
1. Initialization.
2. Gradient clipping.
3. BOS and EOS tokens.
4. Dropout (on the input word embeddings and on the outputs).
5. Adam optimizer.
6. Attention:
   1. Use a tanh scoring layer (as in jointly-learned additive attention).
   2. The initial state/output can be zero.
   3. LSTM works better than GRU.
   4. The output should use s_n instead of s_{n-1}.
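Gradient clipping (trick 2 above) is usually done by rescaling the whole gradient so its global L2 norm stays under a threshold. A minimal NumPy sketch, with a hypothetical helper name `clip_gradients` (frameworks ship their own version, e.g. `torch.nn.utils.clip_grad_norm_`):

```python
import numpy as np

def clip_gradients(grads, max_norm):
    # Global L2 norm across every parameter's gradient.
    total_norm = np.sqrt(sum(float((g ** 2).sum()) for g in grads))
    if total_norm > max_norm:
        # Rescale all gradients by the same factor so the
        # global norm becomes (approximately) max_norm.
        scale = max_norm / (total_norm + 1e-6)
        grads = [g * scale for g in grads]
    return grads
```

The rescale is applied uniformly to all parameters, which preserves the gradient's direction while bounding the update size — the usual fix for exploding gradients in RNNs.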
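The tanh attention scoring in trick 6 can be sketched as Bahdanau-style additive attention: a tanh layer over the decoder state and encoder outputs, softmax-normalized into weights. A minimal NumPy sketch; the names and shapes (`s` decoder state, `H` encoder outputs, projection matrices `W_s`, `W_h`, score vector `v`) are illustrative assumptions:

```python
import numpy as np

def additive_attention(s, H, W_s, W_h, v):
    # s: decoder state, shape (d,); H: encoder outputs, shape (T, d).
    # Score each source position through a tanh layer.
    scores = np.tanh(s @ W_s + H @ W_h) @ v       # shape (T,)
    # Softmax over source positions (stabilized by max-subtraction).
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of encoder outputs.
    context = weights @ H                          # shape (d,)
    return weights, context
```

The weights sum to one, so the context vector is a convex combination of the encoder outputs.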