Sunday, July 29, 2018

Game Design Concepts 101

(draft in progress)

Gaming Mechanics

Bartle's taxonomy identifies four types of gamers, especially in MMOGs:
  • Achiever
  • Explorer
  • Socializer
  • Killer
Concept: how do you measure the gaming experience?

Modern games require designing addictive cycles to keep gamers engaged.
It's a big deal because it can shake your moral ground: making a game addictive means making a successful product, but also potentially doing harm to gamers.

Game Algorithms

In-Game Economy Design
Virtual goods are all the rage. That's how a lot of freemium games and social games make a buck these days.

Online Social Games
Examples include Facebook games like FarmVille: lots of traffic, a viral factor, and millions of players each day (at its height, 100-million-plus players played online social games daily).

Some gaming companies grew so huge that their entire focus shifted to analytics instead of game design.

Concept - Gamification
Giving things that are not pure games gaming elements and incentives to drive results. Gamification takes advantage of fun, addictive gaming mechanics to encourage results.

Summary statistics, segmentation, averages per segment, user acquisition, conversion, lifetime customer value, lifetime spending.
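The segmentation and lifetime-value metrics above can be sketched in a few lines. This is a minimal illustration on made-up data (the segment names and spending numbers are hypothetical, not from the text):

```python
# Sketch: average spending per segment and overall lifetime customer value (LTV),
# using hypothetical user records of (user_id, segment, lifetime_spending).
from collections import defaultdict

users = [
    ("u1", "payer",   120.0),
    ("u2", "payer",    30.0),
    ("u3", "free",      0.0),
    ("u4", "free",      0.0),
    ("u5", "whale",  2400.0),
]

by_segment = defaultdict(list)
for _, segment, spend in users:
    by_segment[segment].append(spend)

# Average lifetime spending within each segment
avg_per_segment = {seg: sum(v) / len(v) for seg, v in by_segment.items()}

# Overall lifetime customer value across all users
ltv = sum(spend for _, _, spend in users) / len(users)

print(avg_per_segment)  # {'payer': 75.0, 'free': 0.0, 'whale': 2400.0}
print(ltv)              # 510.0
```

A real analytics pipeline would pull these records from an events database, but the groupby-then-average shape is the same.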

Tools for Game Developers:

  • HTML Canvas can be used for gaming. The drawback is that it has no built-in undo or redo; you have to re-draw everything each frame.

Tuesday, July 24, 2018

List of Natural Language Processing (NLP) and Machine Learning Papers

  • Andreas, J., Rohrbach, M., Darrell, T., Klein, D., 2016. Neural Module Networks, CVPR
  • Auli, M., Galley, M., Quirk, C. and Zweig, G., 2013. Joint language and translation modeling with recurrent neural networks. In EMNLP.
  • Auli, M., and Gao, J., 2014. Decoder integration and expected BLEU training for recurrent neural network language models. In ACL.
  • Bahdanau, D., Cho, K., and Bengio, Y. 2015. Neural machine translation by jointly learning to align and translate, in ICLR 2015.
  • Bejar, I., Chaffin, R. and Embretson, S. 1991. Cognitive and psychometric analysis of analogical problem solving. Recent research in psychology.
  • Bengio, Y., 2009. Learning deep architectures for AI. Foundations and Trends in Machine Learning, vol. 2.
  • Bengio, Y., Courville, A., and Vincent, P. 2013. Representation learning: A review and new perspectives. IEEE Trans. PAMI, vol. 38, pp. 1798-1828.
  • Bengio, Y., Ducharme, R., and Vincent, P., 2000. A Neural Probabilistic Language Model, in NIPS.
  • Berant, J., Chou, A., Frostig, R., Liang, P. 2013. Semantic Parsing on Freebase from Question-Answer Pairs. In EMNLP.
  • Berant, J., and Liang, P. 2014. Semantic parsing via paraphrasing. In ACL.
  • Bian, J., Gao, B., Liu, T. 2014. Knowledge-Powered Deep Learning for Word Embedding. In ECML.
  • Blei, D., Ng, A., and Jordan M. 2001. Latent dirichlet allocation. In NIPS.
  • Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J. and Yakhnenko, O. 2013. Translating Embeddings for Modeling Multi-relational Data. In NIPS.
  • Bordes, A., Chopra, S., and Weston, J. 2014. Question answering with subgraph embeddings. In EMNLP.
  • Bordes, A., Glorot, X., Weston, J. and Bengio Y. 2012. Joint Learning of Words and Meaning Representations for Open-Text Semantic Parsing. In AISTATS.
  • Brown, P., deSouza, P. Mercer, R., Della Pietra, V., and Lai, J. 1992. Class-based n-gram models of natural language. Computational Linguistics 18 (4).
  • Chandar, A. P. S., Lauly, S., Larochelle, H., Khapra, M. M., Ravindran, B., Raykar, V., and Saha, A. (2014). An autoencoder approach to learning bilingual word representations. In NIPS.
  • Chang, K., Yih, W., and Meek, C. 2013. Multi-Relational Latent Semantic Analysis. In EMNLP.
  • Chang, K., Yih, W., Yang, B., and Meek, C. 2014. Typed Tensor Decomposition of Knowledge Bases for Relation Extraction. In EMNLP.
  • Collobert, R., and Weston, J. 2008. A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning. In ICML.
  • Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., and Kuksa, P., 2011. Natural language processing (almost) from scratch. in JMLR, vol. 12.
  • Cui, L., Zhang, D., Liu, S., Chen, Q., Li, M., Zhou, M., and Yang, M. (2014). Learning topic representation for SMT with neural networks. In ACL.
  • Dahl, G., Yu, D., Deng, L., and Acero, 2012. A. Context-dependent, pre-trained deep neural networks for large vocabulary speech recognition, IEEE Trans. Audio, Speech, & Language Proc., Vol. 20 (1), pp. 30-42.
  • Deerwester, S., Dumais, S. T., Furnas, G. W., Landauer, T., and Harshman, R. 1990. Indexing by latent semantic analysis. J. American Society for Information Science, 41(6): 391-407
  • Devlin, J., Cheng, H., Fang, H., Gupta, S., Deng, L., He, X., Zweig, G., and Mitchell, M., 2015. Language Models for Image Captioning: The Quirks and What Works, ACL
  • Deng, L., He, X., Gao, J., 2013. Deep stacking networks for information retrieval, ICASSP
  • Deng, L., Seltzer, M., Yu, D., Acero, A., Mohamed, A., and Hinton, G., 2010. Binary Coding of Speech Spectrograms Using a Deep Auto-encoder, in Interspeech.
  • Deng, L., Tur, G, He, X, and Hakkani-Tur, D. 2012. Use of kernel deep convex networks and end-to-end learning for spoken language understanding, Proc. IEEE Workshop on Spoken Language Technologies.
  • Deng, L., Yu, D. and Acero, A. 2006. Structured speech modeling, IEEE Trans. on Audio, Speech and Language Processing, vol. 14, no. 5, pp. 1492-1504.
  • Deng, L., Yu, D., and Platt, J. 2012. Scalable stacking and learning for building deep architectures, Proc. ICASSP.
  • Deng, L. and Yu, D. 2014. Deep learning: methods and applications. Foundations and Trends in Signal Processing 7:3-4.
  • Deoras, A., and Sarikaya, R., 2013. Deep belief network based semantic taggers for spoken language understanding, in INTERSPEECH.
  • Devlin, J., Zbib, R., Huang, Z., Lamar, T., Schwartz, R., and Makhoul, J., 2014. Fast and Robust Neural Network Joint Models for Statistical Machine Translation, ACL.
  • Duh, K. 2014. Deep learning for natural language processing and machine translation. Tutorial. 2014.
  • Duh, K., Neubig, G., Sudoh, K., and Tsukada, H. (2013). Adaptation data selection using neural language models: Experiments in machine translation. In ACL.
  • Fader, A., Zettlemoyer, L., and Etzioni, O. 2013. Paraphrase-driven learning for open question answering. In ACL.
  • Fang, H., Gupta, S., Iandola, F., Srivastava, R., Deng, L., Dollár, P., Gao, J., He, X., Mitchell, M., Platt, J., Zitnick, L., Zweig, G., “From Captions to Visual Concepts and Back,” arXiv:1411.4952
  • Faruqui, M. and Dyer, C. (2014). Improving vector space word representations using multilingual correlation. In EACL.
  • Faruqui, M., Dodge, J., Jauhar, S., Dyer, C., Hovy, E., Smith, N. 2015. Retrofitting Word Vectors to Semantic Lexicons. In NAACL-HLT.
  • Faruqui, M., Tsvetkov, Y., Yogatama, D., Dyer, C., Smith, N. 2015. Sparse Overcomplete Word Vector Representations. In ACL.
  • Firth, J. R. 1957. Papers in Linguistics 1934–1951. Oxford University Press.
  • Frome, A., Corrado, G., Shlens, J., Bengio, S., Dean, J., Ranzato, M., and Mikolov, T., 2013. DeViSE: A Deep Visual-Semantic Embedding Model, Proc. NIPS.
  • Galárraga, L., Teflioudi, C., Hose, K., Suchanek, F. 2013. Association Rule Mining Under Incomplete Evidence in Ontological Knowledge Bases. In WWW.
  • Gao, J., He, X., Yih, W-t., and Deng, L. 2014a. Learning continuous phrase representations for translation modeling. In ACL.
  • Gao, J., He, X., and Nie, J-Y. 2010. Clickthrough-based translation models for web search: from word models to phrase models. In CIKM.
  • Gao, J., Pantel, P., Gamon, M., He, X., Deng, L., and Shen, Y. 2014b. Modeling interestingness with deep neural networks. In EMNLP
  • Gao, J., Toutanova, K., Yih., W-T. 2011. Clickthrough-based latent semantic models for web search. In SIGIR.
  • Gao, J., Yuan, W., Li, X., Deng, K., and Nie, J-Y. 2009. Smoothing clickthrough data for web search ranking. In SIGIR.
  • Gao, J., and He, X. 2013. Training MRF-based translation models using gradient ascent. In NAACL-HLT.
  • Getoor, L., and Taskar, B. editors. 2007. Introduction to Statistical Relational Learning. The MIT Press.
  • Graves, A., Jaitly, N., and Mohamed, A., 2013a. Hybrid speech recognition with deep bidirectional LSTM, Proc. ASRU.
  • Graves, A., Mohamed, A., and Hinton, G., 2013. Speech recognition with deep recurrent neural networks, Proc. ICASSP.
  • He, J., Chen, J., He, X., Gao, J., Li, L., Deng, L., Ostendorf, M., 2015 Deep Reinforcement Learning with an Action Space Defined by Natural Language, arXiv:1511.04636 (to appear on EMNLP16)
  • He, X. and Deng, L., 2013. Speech-Centric Information Processing: An Optimization-Oriented Approach, in Proceedings of the IEEE.
  • He, X. and Deng, L., 2012. Maximum Expected BLEU Training of Phrase and Lexicon Translation Models , ACL.
  • He, X., Deng, L., and Chou, W., 2008. Discriminative learning in sequential pattern recognition, Sept. IEEE Sig. Proc. Mag.
  • Hermann, K. M. and Blunsom, P. (2014). Multilingual models for compositional distributed semantics. In ACL.
  • Hinton, G., Deng, L., Yu, D., Dahl, G., Mohamed, A., Jaitly, N., Senior, A., Vanhoucke, V., Nguyen, P., Sainath, T., and Kingsbury, B., 2012. Deep Neural Networks for Acoustic Modeling in Speech Recognition, IEEE Signal Processing Magazine, vol. 29, no. 6, pp. 82-97.
  • Hinton, G., Osindero, S., and Teh, Y-W. 2006. A fast learning algorithm for deep belief nets. Neural Computation, 18: 1527-1554.
  • Hinton, G., and Salakhutdinov, R., 2010. Discovering binary codes for documents by learning deep generative models. Topics in Cognitive Science.
  • Hu, Y., Auli, M., Gao, Q., and Gao, J. 2014. Minimum translation modeling with recurrent neural networks. In EACL.
  • Huang, E., Socher, R., Manning, C, and Ng, A. 2012. Improving word representations via global context and multiple word prototypes, Proc. ACL.
  • Huang, P., He, X., Gao, J., Deng, L., Acero, A., and Heck, L. 2013. Learning deep structured semantic models for web search using clickthrough data. In CIKM.
  • Hutchinson, B., Deng, L., and Yu, D., 2012. A deep architecture with bilinear modeling of hidden representations: Applications to phonetic recognition, Proc. ICASSP.
  • Hutchinson, B., Deng, L., and Yu, D., 2013. Tensor deep stacking networks, IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 35, pp. 1944 - 1957.
  • Jansen, P., Surdeanu, M., Clark, P. 2014. Discourse Complements Lexical Semantics for Non-factoid Answer Reranking. In ACL.
  • Jurgens, D., Mohammad, S., Turney, P. and Holyoak, K. 2012. SemEval-2012 Task 2: Measuring degrees of relational similarity. In SemEval.
  • Jurafsky, D., & Martin, J. H. (2014). Speech and language processing (Vol. 3). London: Pearson.
  • Kafle, K., Kanan, C., 2016. Answer-Type Prediction for Visual Question Answering, CVPR
  • Kalchbrenner, N. and Blunsom, P. (2013). Recurrent continuous translation models. In EMNLP.
  • Kiros, R., Zemel, R., and Salakhutdinov, R. 2013. Multimodal Neural Language Models, Proc. NIPS Deep Learning Workshop.
  • Klementiev, A., Titov, I., and Bhattarai, B. (2012). Inducing crosslingual distributed representations of words. In COLING.
  • Kocisky, T., Hermann, K. M., and Blunsom, P. (2014). Learning bilingual word representations by marginalizing alignments. In ACL.
  • Koehn, P. 2009. Statistical Machine Translation. Cambridge University Press.
  • Krizhevsky, A., Sutskever, I, and Hinton, G., 2012. ImageNet Classification with Deep Convolutional Neural Networks, NIPS.
  • Landauer. T., 2002. On the computational basis of learning and cognition: Arguments from LSA. Psychology of Learning and Motivation, 41:43–84.
  • Lao, N., Mitchell, T., and Cohen, W. 2011. Random walk inference and learning in a large scale knowledge base. In EMNLP.
  • Lauly, S., Boulanger, A., and Larochelle, H. (2013). Learning multilingual word representations using a bag-of-words autoencoder. In NIPS.
  • Le, H-S, Oparin, I., Allauzen, A., Gauvain, J-L., Yvon, F., 2013. Structured output layer neural network language models for speech recognition, IEEE Transactions on Audio, Speech and Language Processing.
  • LeCun, Y., Bottou, L., Bengio, Y., and Haffner, P. 1998. Gradient-based learning applied to document recognition, Proceedings of the IEEE, Vol. 86, pp. 2278-2324.
  • Levy, O., and Goldberg, Y. 2014. Linguistic Regularities in Sparse and Explicit Word Representations. In CoNLL.
  • Levy, O., and Goldberg, Y. 2014. Neural Word Embeddings as Implicit Matrix Factorization. In NIPS.
  • Li, P., Hastie, T., and Church, K.. 2006. Very sparse random projections, in Proc. SIGKDD.
  • Li, P., Liu, Y., and Sun, M. (2013). Recursive autoencoders for ITG-based translation. In EMNLP.
  • Li, P., Liu, Y., Sun, M., Izuha, T., and Zhang, D. (2014b). A neural reordering model for phrase-based translation. In COLING.
  • Liu, S., Yang, N., Li, M., and Zhou, M. (2014). A recursive recurrent neural network for statistical machine translation. In ACL.
  • Liu, X., Gao, J., He, X., Deng, L., Duh, K., Wang, Y., 2015. Representation learning using multi-task deep neural networks for semantic classification and information retrieval, NAACL
  • Liu, L., Watanabe, T., Sumita, E., and Zhao, T. (2013). Additive neural networks for statistical machine translation. In ACL.
  • Lu, S., Chen, Z., and Xu, B. (2014). Learning new semi-supervised deep auto-encoder features for statistical machine translation. In ACL.
  • Maskey, S., and Zhou, B. 2012. Unsupervised deep belief feature for speech translation, in ICASSP.
  • Mesnil, G., He, X., Deng, L., and Bengio, Y., 2013. Investigation of Recurrent-Neural-Network Architectures and Learning Methods for Spoken Language Understanding, in Interspeech.
  • Mikolov, T., Kombrink, S., Burget, L., Cernocky, J., Khudanpur, S. 2011. Extensions of recurrent neural network based language model. In ICASSP.
  • Mikolov, T. 2012. Statistical Language Models based on Neural Networks, Ph.D. thesis, Brno University of Technology.
  • Mikolov, T., Chen, K., Corrado, G., and Dean, J. 2013. Efficient estimation of word representations in vector space, Proc. ICLR.
  • Mikolov, T., Yih, W., Zweig, G., 2013. Linguistic Regularities in Continuous Space Word Representations. In NAACL-HLT.
  • Mikolov, T., Sutskever, I., Chen, K., Corrado, G., and Dean, J. 2013. Distributed Representations of Words and Phrases and their Compositionality. In NIPS.
  • Mnih, A., Kavukcuoglu, K. 2013. Learning word embeddings efficiently with noise-contrastive estimation. In NIPS.
  • Mnih, V., Kavukcuoglu, K., Silver, D., Graves, A., Antonoglou, I., Wierstra, D., Riedmiller, M., 2013. Playing Atari with Deep Reinforcement Learning, NIPS
  • Mohamed, A., Yu, D., and Deng, L. 2010. Investigation of full-sequence training of deep belief networks for speech recognition, Proc. Interspeech.
  • Mohammad, S., Dorr, Bonnie., and Hirst, G. 2008. Computing word pair antonymy. In EMNLP.
  • Narasimhan, K., Kulkarni, T., Barzilay, R., 2015. Language Understanding for Text-based Games Using Deep Reinforcement Learning. EMNLP
  • Ngiam, J., Khosla, A., Kim, M., Nam, J., Lee, H., and Ng, A. 2011. Multimodal deep learning, Proc. ICML.
  • Nickel, M., Tresp, V., and Kriegel, H. 2011. A three-way model for collective learning on multi-relational data. In ICML.
  • Niehues, J., Waibel, A. 2013. Continuous space language models using Restricted Boltzmann Machines. In IWLT.
  • Noh, H., Seo, P., Han, B., 2016. Image Question Answering Using Convolutional Neural Network With Dynamic Parameter Prediction, CVPR
  • Palangi, H., Deng, L., Shen, Y., Gao, J., He, X., Chen, J., Song, X., Ward R., 2016. Deep sentence embedding using long short-term memory networks: Analysis and application to information retrieval, IEEE/ACM Transactions on Audio, Speech, and Language Processing 24 (4), 694-707
  • Pennington, J., Socher, R., Manning, C. 2014. Glove: Global Vectors for Word Representation. In EMNLP.
  • Reddy, S., Lapata, M., and Steedman, M. 2014. Large-scale semantic parsing without question-answer pairs. Transactions of the Association for Computational Linguistics (TACL).
  • Sainath, T., Mohamed, A., Kingsbury, B., and Ramabhadran, B. 2013. Convolutional neural networks for LVCSR, Proc. ICASSP.
  • Salakhutdinov R., and Hinton, G., 2007 Semantic hashing. in Proc. SIGIR Workshop Information Retrieval and Applications of Graphical Models
  • Salton, G. and McGill, M. 1983. Introduction to Modern Information Retrieval. McGraw Hill.
  • Sarikaya, R., Hinton, G., and Ramabhadran, B., 2011. Deep belief nets for natural language call-routing, in Proceedings of the ICASSP.
  • Schwenk, H. 2012. Continuous space translation models for phrase-based statistical machine translation, in COLING.
  • Schwenk, H., Rousseau, A., and Attik, M., 2012. Large, pruned or continuous space language models on a gpu for statistical machine translation, in NAACL-HLT 2012 Workshop.
  • Seide, F., Li, G., and Yu, D. 2011. Conversational speech transcription using context-dependent deep neural networks, Proc. Interspeech
  • Shen, Y., He, X., Gao, J., Deng, L., and Mesnil, G. 2014. Learning Semantic Representations Using Convolutional Neural Networks for Web Search, in Proceedings of WWW.
  • Shen, Y., He, X., Gao, J., Deng, L., and Mesnil, G. 2014. A convolutional latent semantic model for web search. CIKM
  • Shih, K., Singh, S., Hoiem, D., 2016. Where to Look: Focus Regions for Visual Question Answering, CVPR
  • Silver, D., Huang, A., Maddison, C., Guez, A., Sifre, L., van den Driessche, G., Schrittwieser, J., Antonoglou, I., Panneershelvam, V., Lanctot, M., Dieleman, S., Grewe, D., Nham, J., Kalchbrenner, N., Sutskever, I., Lillicrap, T., Leach, M., Kavukcuoglu, K., Graepel, T., Hassabis, D., 2016. Mastering the game of Go with deep neural networks and tree search, Nature
  • Simonyan, K., Zisserman, A., 2015 Very Deep Convolutional Networks for Large-Scale Image Recognition. ICLR 2015
  • Socher, R., Chen, D., Manning, C., and Ng, A. 2013. Reasoning With Neural Tensor Networks For Knowledge Base Completion. In NIPS.
  • Socher, R., Huval, B., Manning, C., Ng, A., 2012. Semantic compositionality through recursive matrix-vector spaces. In EMNLP.
  • Socher, R., Lin, C., Ng, A., and Manning, C. 2011. Learning continuous phrase representations and syntactic parsing with recursive neural networks, Proc. ICML.
  • Socher, R., Perelygin, A., Wu, J., Chuang, J., Manning, C., Ng A., and Potts. C. 2013. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank, Proc. EMNLP
  • Son, L. H., Allauzen, A., and Yvon, F. (2012). Continuous space translation models with neural networks. In NAACL.
  • Song, X. He, X., Gao. J., and Deng, L. 2014. Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model. MSR Tech Report.
  • Song, Y., Wang, H., and He, X., 2014. Adapting Deep RankNet for Personalized Search. Proc. WSDM.
  • Songyot, T. and Chiang, D. (2014). Improving word alignment using word similarity. In EMNLP.
  • Sundermeyer, M., Alkhouli, T., Wuebker, J., and Ney, H. (2014). Translation modeling with bidirectional recurrent neural networks, in EMNLP.
  • Sutton, R., Barto, A., 1998. Reinforcement Learning: An Introduction. MIT Press.
  • Tamura, A., Watanabe, T., and Sumita, E. (2014). Recurrent neural networks for word alignment model. In ACL.
  • Tapaswi, M., Zhu, Y., Stiefelhagen, R., Torralba, A., Urtasun, R., Fidler, S., 2016. MovieQA: Understanding Stories in Movies Through Question-Answering, CVPR
  • Tran, K. M., Bisazza, A., and Monz, C. (2014). Word translation prediction for morphologically rich languages with bilingual neural networks. In EMNLP.
  • Tran, K., He, X., Zhang, L., Sun, J., Carapcea, C., Thrasher, C., Buehler, C., Sienkiewicz, C., “Rich Image Captioning in the Wild,” DeepVision, CVPR 2016
  • Tur, G., Deng, L., Hakkani-Tur, D., and He, X., 2012. Towards Deeper Understanding Deep Convex Networks for Semantic Utterance Classification, in ICASSP.
  • Turney P. 2008. A uniform approach to analogies, synonyms, antonyms, and associations. In COLING.
  • Vaswani, A., Zhao, Y., Fossum, V., and Chiang, D. 2013. Decoding with large-scale neural language models improves translation. In EMNLP.
  • Wang, H., He, X., Chang, M., Song, Y., White, R., Chu, W., 2013. Personalized ranking model adaptation for web search. In SIGIR.
  • Wang, Z., Zhang, J., Feng, J., Chen, Z. 2014. Knowledge Graph and Text Jointly Embedding. In EMNLP.
  • Watkins, C., and Dayan, P., 1992. Q-learning. Machine Learning
  • Wright, S., Kanevsky, D., Deng, L., He, X., Heigold, G., and Li, H., 2013. Optimization Algorithms and Applications for Speech and Language Processing, in IEEE Transactions on Audio, Speech, and Language Processing, vol. 21, no. 11.
  • Wu, Q., Wang, P., Shen, C., Dick, A., Hengel, A., 2016. Ask Me Anything: Free-Form Visual Question Answering Based on Knowledge From External Sources, CVPR
  • Wu, H., Dong, D., Hu, X., Yu, D., He, W., Wu, H., Wang, H., and Liu, T. (2014a). Improve statistical machine translation with context-sensitive bilingual semantic embedding model. In EMNLP.
  • Wu, Y., Watanabe, T., and Hori, C. (2014b). Recurrent neural network-based tuple sequence model for machine translation. In COLING.
  • Xu, C., Bai, Y., Bian, J., Gao, B., Wang, G., Liu, X., Liu, T. 2014. RC-NET: A General Framework for Incorporating Knowledge into Word Representations. In CIKM.
  • Yang, B., Yih, W., He, X., Gao, J., and Deng L. 2015. Embedding Entities and Relations for Learning and Inference in Knowledge Bases. In ICLR.
  • Yang, N., Liu, S., Li, M., Zhou, M., and Yu, N. 2013. Word alignment modeling with context dependent deep neural network. In ACL.
  • Yang, Y., Chang, M. 2015. S-MART: Novel Tree-based Structured Learning Algorithms Applied to Tweet Entity Linking. In ACL.
  • Yao, K., Zweig, G., Hwang, M-Y. , Shi, Y., Yu, D., 2013. Recurrent neural networks for language understanding, submitted to Interspeech.
  • Yao, X., Van Durme, B. 2014. Information Extraction over Structured Data: Question Answering with Freebase. In ACL.
  • Yann, D., Tur, G., Hakkani-Tur, D., Heck, L., 2014. Zero-Shot Learning and Clustering for Semantic Utterance Classification Using Deep Learning. In ICLR
  • Yogatama, D., Faruqui, M., Dyer, C., Smith, N. 2015. Learning Word Representations with Hierarchical Sparse Coding. In ICML.
  • Yih, W., Toutanova, K., Platt, J., and Meek, C. 2011. Learning discriminative projections for text similarity measures. In CoNLL.
  • Yih, W., Zweig, G., Platt, J. 2012. Polarity Inducing Latent Semantic Analysis. In EMNLP-CoNLL.
  • Yih, W., Chang, M., Meek, C., Pastusiak, A. 2013. Question Answering Using Enhanced Lexical Semantic Models. In ACL.
  • Yih, W., He, X., Meek, C. 2014. Semantic Parsing for Single-Relation Question Answering. In ACL.
  • Yih, W., Chang, M., He, X., Gao, J. 2015. Semantic parsing via staged query graph generation: Question answering with knowledge base, In ACL.
  • Zeiler, M. and Fergus, R. 2013. Visualizing and understanding convolutional networks, arXiv:1311.2901, pp. 1-11.
  • Zhang, J., Liu, S., Li, M., Zhou, M., and Zong, C. (2014). Bilingually-constrained phrase embeddings for machine translation. In ACL.
  • Zhu, Y., Groth, O., Bernstein, M., Fei-Fei, L., 2016. Visual7W: Grounded Question Answering in Images, CVPR
  • Zou, W. Y., Socher, R., Cer, D., and Manning, C. D. (2013). Bilingual word embeddings for phrase-based machine translation. In EMNLP.

Thursday, July 19, 2018

Reinforcement Learning Q Learning

Explore <s, a> ---> s', which reads: move from current state s to s' via action a. Through the action a reward is received; it can be positive (positive reinforcement) or negative (punishment or discouragement). As the agent explores the environment, it updates the Q table, which tracks the accumulated scores.

Bellman Equation is one of the utility equations used to track scores.
U(s) = R(s) + ɣ max_a Σ_{s'} T(s,a,s') U(s')
where T(s,a,s') is the probability of landing in state s' after taking action a in state s. The function is nonlinear (because of the max). It means the current utility is the immediate reward plus a discounted fraction (ɣ) of the best expected total of future rewards.

Start with arbitrary utilities, explore, and update each state based on the allowed neighboring moves, i.e. the states it can reach. Update at every iteration.
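The explore-and-update loop above can be sketched as minimal tabular Q-learning on a toy one-dimensional grid world. Everything specific here (grid size, reward placement, learning rate, number of episodes) is an illustrative assumption, not from the text:

```python
# Minimal tabular Q-learning sketch: a 5-state chain where reaching the
# rightmost state gives reward +1. All hyperparameters are illustrative.
import random

random.seed(0)
N_STATES = 5              # states 0..4; state 4 is the rewarding terminal
ACTIONS = [-1, +1]        # move left or right
GAMMA, ALPHA, EPSILON = 0.9, 0.5, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(s, a):
    """Deterministic transition with walls at both ends."""
    s2 = max(0, min(N_STATES - 1, s + a))
    reward = 1.0 if s2 == N_STATES - 1 else 0.0
    return s2, reward

for episode in range(200):
    s = 0
    for _ in range(1000):  # step cap keeps episodes finite
        # epsilon-greedy exploration, with random tie-breaking
        if random.random() < EPSILON:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: (Q[(s, act)], random.random()))
        s2, r = step(s, a)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2
        if s == N_STATES - 1:
            break

# The learned greedy policy should always move right, toward the reward.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)  # expect {0: 1, 1: 1, 2: 1, 3: 1}
```

Note this is the Q-table variant of the Bellman update: instead of tracking U(s) directly, it tracks Q(s, a) and derives the policy by taking the arg-max action per state.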

Wednesday, July 18, 2018

F1 Score - Machine Learning

F1 Score is a useful metric for classification models (rather than regression models). It is a metric that goes well with the confusion matrix. The F1 score, aka a performance score, is also frequently used in statistical analysis. You can read more about the F1 score on the Wikipedia page and in the sklearn F1 score documentation.

F1 Score and accuracy are both used in classification tasks. Accuracy has some shortfalls, for example when the dataset is obviously biased. If most of the input data is negative (of the negative class only), say 99.99%, then the machine does not need to learn anything intelligent: it can just guess "negative" every time and still be 99.99% accurate. The F1 score is a shorthand for a composite score over the confusion matrix: true positives, true negatives, false positives, and false negatives.

The F1 score is a combination of recall and precision. It is also a shorthand for measuring how accurate and useful the result is.

Accuracy is the simple fraction of correctly classified objects over the total number of objects.

It can be misleading to focus only on accuracy, especially when data labels are imbalanced, even if the data is representative. Certain scenarios are simply more prevalent in the population. For example, by definition, orphan diseases are minority data points in the real world.
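The imbalanced-label pitfall above can be shown concretely. This is a hand-rolled sketch on made-up data (the 1000-example split and the hypothetical model's predictions are illustrative assumptions); in practice you would use sklearn's metrics:

```python
# Why accuracy misleads on imbalanced labels: the "always negative" guesser
# scores high accuracy but zero F1, while a real model scores well on both.
def scores(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, f1

y_true = [1] * 10 + [0] * 990          # heavily imbalanced: 10 positives in 1000
always_negative = [0] * 1000           # lazy guesser
model = [1] * 8 + [0] * 2 + [0] * 990  # catches 8 of the 10 positives

print(scores(y_true, always_negative))  # (0.99, 0.0): high accuracy, useless
print(scores(y_true, model))            # accuracy 0.998, F1 ~0.89
```

The lazy guesser is 99% accurate yet detects nothing; F1 exposes that immediately because precision and recall are both zero.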

Get the most out of your Udacity Nanodegree and Subscription

Udacity is pricey. You are on a budget and desperately need a better job. Here are some great tips to take advantage of your Udacity subscription.

Udacity Career Partners and Career Hub

Udacity offers video tutorials on technology as well as on how to write a resume, start a startup, and more. In addition, each nanodegree is created in partnership with top tech companies; take advantage of these hidden connections. Reach out to content creators and industry leaders from Google.

Udacity Career Conferences

It's real. It works. Top Silicon Valley companies actually come to review your resume and interview you in person. Highly recommended. I can give plenty of personal anecdotes about how well this worked out for me.

Udacity Career Profile

Completed multiple nanodegrees? You can turn your "ADHD" and inability to stop learning into a career advantage: show on your career profile that you have the grit and resourcefulness to complete multiple nanodegrees. Update it regularly.

Make your Capstone Project Portfolio Ready

These days, companies hire if you have a great portfolio, not a great label. Turn your capstone project into a recruiter-ready professional Medium post, a GitHub repo, LinkedIn-ready slides, or a PDF. Do this while completing the capstone; it is so much easier. Once you are done with the nanodegree, it's really hard to go back.

For example, the Udacity digital marketing project slides are presentation-ready. And you get real-world experience marketing for Udacity on AdWords, Facebook, and Instagram.


Though not always helpful, a Udacity Nanodegree subscription does come with an online mentor. Remember, you can always request to change mentors if you have any trouble.

Udacity Forum

A great place for discussion, gaining traffic, and asking for advice.

Tuesday, July 17, 2018

Better Relationships in Silicon Valley

You are here to win and start a startup, but the journey of being an entrepreneur can be lonely, especially if you are a solopreneur. Have you thought about starting a meaningful relationship while you are here? Here are some tips and resources for you.

Hinge: a professional, Tinder-like dating app, often for Ivy League-educated young professionals.

Coffee Meets Bagel blog: another dating app that offers some advice on its blog.

A TED talk about relationships, plus more TED relationship talks.

A former OkCupid blogger became famous for his insights and findings about relationships and online dating profiles. He has turned those insights into a full book.

Monday, July 16, 2018

100 Social Networks, Resources and Sites for Jobs in Silicon Valley


A women-friendly job site complete with great tips, startup office info and perks, and other pretty things that help ease job-search stress.


Tinder, but for business professionals: Wave matches you with young business professionals you should connect with.


Who says work and love have to be separate? Meet young business and tech professionals using Hinge, a dating app popular in Silicon Valley.

Silicon Valley HBO TV Show

Need to learn to talk-the-talk walk-the-walk in the Valley? Get inspired and entertained by watching Silicon Valley the HBO show. 

100 Amazing Tools and Resources for Startup Founders

Brainstorm Startup Ideas and Domain Names Using Generators

One surprisingly easy hack is to precede the startup name with "try" or "get", example: getAlto, tryAlto
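The "try"/"get" hack above is mechanical enough to automate. A tiny sketch (the function name and example startup name are hypothetical):

```python
# Generate candidate domain names by prefixing a startup name with "try"/"get".
def domain_ideas(name, prefixes=("try", "get"), tld=".com"):
    return [prefix + name.lower() + tld for prefix in prefixes]

print(domain_ideas("Alto"))  # ['tryalto.com', 'getalto.com']
```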

Bootstrap - Front End Framework

Bootstrap, previously known as Twitter Bootstrap, is a super popular framework for front-end development.

Use GSuite or Google Domain to Host Your Custom Domain Email

Want email at your own domain instead of a generic address, so that your company looks official and trustworthy? Use G Suite's Gmail hosting or Google Domains' email forwarding service (which only works one way: it only receives custom-domain emails).

Trello Board

Organize development cycles and sprints with Trello Board.

Material Design - Front End Framework and Stylebook

Google's Material Design helps you design Android-like apps with paper-like animations and layouts.

Flat UI - CSS Framework

Flat UI helps you design flat, Material Design-like UIs.

Prototyping, Wireframing Tools

Invision, Marvel

Marvel can easily mock up mobile apps in minutes, for free.

Free Professional Apps for iPad

Expensive apps like Adobe's and premium MailChimp features are actually available in various forms as iPad apps. You can use advanced features for free! MailChimp even has an offline app for collecting emails at events and conferences.


Get gigs done on Fiverr for $5 and up.

Reddit, Product Hunt, Imgur, and Hacker News are all important social networks for startup founders.

Outsource and Delegate to Offsite Teams

Google Voice

Use google voice to set up a separate number for work.

Protect Your Intellectual Property

File patents and register copyright protection.


Google Trend - Product Research & Market Research - Growth

Yoast - SEO for WordPress

Make Gifs and Screenshots

Some Chrome Extensions also have the capability. 


Unsplash - Stock Photo

Unsplash provides high-quality, startup-friendly, royalty-free stock photos.

Code Libraries

In addition to frameworks, there are also code libraries such as jQuery UI widgets and WordPress themes.

WordPress Themes

Startup themes are available for purchase on ThemeForest. These templates will instantly make your website look like a legitimate startup's. However, speed is a serious concern for WordPress. Without snappy speed, a startup website gives off the wrong vibe. How can you raise funds for your tech startup if your website is slow?

Bootstrap, a popular front-end framework, can be easily integrated with WordPress.

Hire Designers on 99 Design and Fiverr

Splashthat - Make Instant Event Invite Pages

These pages are called splash landing pages. "Splash" refers to how instant and short-lived the pages are; they are usually used for a particular event, a single purpose, or an Optimizely experiment.

Pinterest Board for Web Design Mood Boarding

Pinterest Rich Pins are content-friendly, sophisticated pins (with specialized meta tags) that display detailed information for content such as recipes and blog articles; Buyable Pins are another specialized pin type.

Add E-Commerce Capability to Your Youtube Channel and Games

Alto's Adventure and Alto's Odyssey - top-selling iOS games - use Shopify as their e-commerce platform to sell specialty merchandise such as a stuffed llama, a beloved character in the game.

----------------------- ----------------------- ----------------------- -----------------------
More Tips and Tools:

  • Understand your customer journey (see Google's Customer Journey tool)
  • Practice the philosophy of running a lean startup
  • How to boost employee happiness without spending any money by Fast Company
  • MailChimp newsletter integration with Shopify and eBay
  • Co-working spaces are more than just physical spaces for an office. They are also great places to connect and meet with people - a real community that is valuable for entrepreneurs, especially solopreneurs.
  • Mock up API calls
  • Web scraping tools: be careful - most sites are protected by Terms of Use, which generally prohibit scraping for commercial purposes.
  • Organize your code snippets with public and private gists on GitHub
  • Version control for designers: Pixelapse (acquired by Dropbox)
  • Use lint and validator tools to validate your code
  • Business Plan
  • Wells Fargo Business Plan tool and Business Intelligence tool
  • Business Plan tool
  • Calendar management:  User -
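The "mock up API calls" tip above can be sketched with a tiny stub in plain JavaScript. The `/api/products` endpoint and its payload below are made up for illustration; the point is that development code can call the stub exactly as it would call `fetch`:

```javascript
// Minimal API mock: swap the real fetch for a stub during development.
// Routes map URLs to canned JSON payloads (hypothetical data here).
function createMockFetch(routes) {
  return async function mockFetch(url) {
    if (!(url in routes)) {
      return { ok: false, status: 404, json: async () => ({ error: "not found" }) };
    }
    return { ok: true, status: 200, json: async () => routes[url] };
  };
}

const fetchStub = createMockFetch({
  "/api/products": [{ id: 1, name: "Stuffed Llama", price: 25 }],
});

// Usage: code under test calls fetchStub exactly like fetch.
fetchStub("/api/products")
  .then((res) => res.json())
  .then((data) => console.log(data[0].name)); // prints "Stuffed Llama"
```

When the real backend is ready, the stub is replaced by the real `fetch` without changing the calling code.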

Statistics Basic 101

Statistics is a dark science ... until you understand it.

Core concepts:

Sample vs Population
Establish a hypothesis for the research
Statistical significance and statistical theory

Evaluating Statistics:
Did we achieve the research objective?
Did we find support for the hypothesis?
What is the conclusion?
What is the next step?
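The workflow above can be illustrated with a small worked example (my own, not from the original notes): test the hypothesis that a coin is fair after observing 60 heads in a sample of 100 flips, using a one-sample proportion z-test.

```javascript
// H0: the coin is fair (p = 0.5). Sample: 60 heads out of 100 flips.
function oneSampleProportionZ(successes, n, p0) {
  const pHat = successes / n;                // sample proportion (sample vs population!)
  const se = Math.sqrt((p0 * (1 - p0)) / n); // standard error under H0
  return (pHat - p0) / se;                   // z statistic
}

const z = oneSampleProportionZ(60, 100, 0.5);
// |z| = 2.0 exceeds the 1.96 critical value, so the result is
// statistically significant at the conventional 5% level: reject H0.
console.log(z.toFixed(2), Math.abs(z) > 1.96);
```

This answers the evaluation questions directly: the hypothesis was not supported (the coin does not look fair), and the next step might be a larger sample to confirm.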

100 Amazing Coding, Machine Learning, Data Science Courses Tutorials on the Internet

While learning to code and improving your coding skills, online learners should avoid getting stuck in the ocean of tutorials and videos - do not get stuck in learners' limbo. It is impractical to know all the details of a framework. Not every pilot knows how to build a plane, and not every machine learning practitioner needs to know all the math behind every algorithm!
  • Udacity
    • Web Development
    • JavaScript Design patterns
    • How to Build a Startup
    • OOP with JavaScript
    • iOS Swift
      • optionals
    • Algorithms
    • Data Analytics
    • Intro to Machine Learning
    • Product Design
    • Firebase for iOS
    • Introduction to Firebase
    • Linear Algebra Review
    • Java
    • D3
    • JavaScript
      • OOP JavaScript
    • Startup
      • how to build a startup
    • Git and Github, version control

  • Coursera 
  • Berkeley School of Information
  • Stanford CS 101 on Coursera
  • R programming and genetic algorithms by Johns Hopkins on Coursera
  • Bioinformatics and Data Science by Johns Hopkins on Coursera
  • Check out our blog post on the free Udacity-Baidu self-driving car seminar
  • Stanford Intro to Logic (past offering)
  • LittleBits gives kids hands-on hardware and software engineering experience. It is slightly more accessible than Raspberry Pi and comes with a variety of sensors and components, such as pressure, light, and temperature sensors.
  • Tutorial code test quiz -
  • W3Schools
    • HTML Dom Events
    • Free tutorials on Angular by w3schools
    • CSS
    • Angular
  • AngularJS
    • Coursera
  • Platzi learning
    • Once funded by Y Combinator. It invites prominent speakers, including YC leaders, to talk about startups, finance, data, growth, and more.
  • Manning Book Practical Data Science with R
  • Coursera
  • Chinese MOOCs
  • Code School was once pretty good and creative. It became a more generic professional training site after being acquired by Pluralsight.
  • Staying Sharp with Angular.js JAVASCRIPT
  • Building Blocks of Express.js JAVASCRIPT
  • Mastering GitHub GIT
  • Shaping up with Angular.js JAVASCRIPT
  • Surviving APIs with Rails RUBY
  • Warming Up With Ember.js JAVASCRIPT
  • Front-end Formations HTML/CSS
  • Core iOS 7 IOS
  • Rails 4 Patterns RUBY
  • jQuery: The Return Flight JAVASCRIPT
  • Try iOS IOS
  • Ruby Bits Part 2 RUBY
  • Ruby Bits RUBY
  • Try Git GIT
  • Real-time Web with Node.js JAVASCRIPT
  • Microsoft AI school
  • Coursera others
    • Wesleyan - creative writing program
    • Johns Hopkins - Rails, AngularJS, MongoDB, HTML, CSS, JavaScript
    • Wharton - analytics, marketing
  • d3 high quality well explained tutorials
  • Meteor Tut+
  • Rails
    • rails
    • one month rails
    • soup to bits code school rails and real life examples
  • Stanford iOS iPhone app class on iTunes U
  • JavaScript AirPair
  • Design lessons for developers
  • Startup prototyping code4startups
  • Google developer channel
  • Game probabilities
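The "game probabilities" topic above can be illustrated with a short enumeration. The two-dice example is my own: rather than simulating, we count all 36 equally likely outcomes exactly.

```javascript
// Exact probability of rolling a given sum with two fair six-sided dice,
// computed by enumerating all 36 equally likely outcomes.
function probabilityOfSum(target) {
  let favorable = 0;
  for (let a = 1; a <= 6; a++) {
    for (let b = 1; b <= 6; b++) {
      if (a + b === target) favorable++;
    }
  }
  return favorable / 36;
}

console.log(probabilityOfSum(7)); // 6/36 ≈ 0.1667, the most likely sum
console.log(probabilityOfSum(2)); // 1/36, snake eyes
```

The same enumerate-and-count approach extends to card draws, loot drops, and other discrete game mechanics.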

WordPress Tutorials and Classes

Treehouse offers WordPress tutorials and classes for a reasonable price - a flat monthly subscription. Treehouse also teaches theme development.

Product Design, Product Management Classes

Udacity offers a product design class -

Learn how to build vector graphics: the Sketch website has its own tutorials -
- Online book on Sketch
- Treehouse offers a Sketch prototyping class
- I once downloaded a discounted version of Sketch using a Treehouse promo code.

WebDesignerLedger a website by designers for designers

How to Start a Startup by Stanford and Y Combinator - the best startup accelerator

A startup class by Platzi - a Y Combinator alum - in collaboration with YC partners

JavaScript Classes, Front End and Full Stack JavaScript

Treehouse offers Node.js classes. Udacity offers Firebase JavaScript and Angular classes.

React has been very popular. Get started with React, rather than plain JavaScript, on the Facebook React website.

Data Visualization Data Analysis, Data Science Classes

Treehouse offers a D3 class. Udacity is the king of Python data science, data analysis, machine learning, and artificial intelligence classes. Udacity also offers Firebase classes in collaboration with Google!

Web Security, Crypto Classes

One Month offers a white-hat hacker class -

Cloud Engine

For Google Cloud, look at Google's website and Coursera. For Amazon, use Amazon's developer site - there are lots of free webinars.

Pro Tips for Online Learners

Use a Kindle e-ink display to avoid eye strain.

Learning Self-Driving Car Engineer for FREE with Udacity and Baidu

Udacity and Baidu have partnered up to teach autonomous vehicle driving and engineering fundamentals for FREE! The course centers around Apollo, Baidu's free, open-source self-driving car framework.

Normally, Udacity nanodegrees and courses cost north of $599 at launch, $200 per month, or even $2,000+ for an entire nanodegree. This free course is amazing news for all learn-to-coders.

The course is launching soon - sign up to be notified.

React UI, UI/UX, Reactstrap, React Bootstrap

Material-UI for React: install the icons package with yarn add @material-ui/icons. Reactstrap provides form components - both controlled and uncontrolled forms - plus columns and a grid system.
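The controlled-form idea mentioned above can be sketched in plain JavaScript without the React runtime. This is a rough illustration of the pattern only - the `createControlledInput` helper and its `onRender` callback are my own, not a React API:

```javascript
// Controlled-form pattern, sketched without React: the input's value lives
// in state, and every change flows through a single handler (like onChange).
function createControlledInput(onRender) {
  let state = { value: "" };
  function handleChange(newValue) {
    state = { value: newValue }; // state is the single source of truth
    onRender(state.value);       // re-render with the new state
  }
  return { handleChange, getValue: () => state.value };
}

// Usage: each keystroke calls handleChange, mimicking React's onChange.
const rendered = [];
const input = createControlledInput((v) => rendered.push(v));
input.handleChange("R");
input.handleChange("Re");
input.handleChange("Reactstrap");
console.log(input.getValue()); // "Reactstrap"
```

An uncontrolled form is the opposite: the DOM holds the value and you read it on demand (React's refs), trading the re-render-per-keystroke cost for less control over validation and formatting.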