Ivan Bogatyy

My interests are in machine learning and natural language understanding. I work on the NYC language research team, where we built the state-of-the-art syntactic parsers SyntaxNet and Parsey McParseface, as well as their successors, DRAGNN and ParseySaurus. I focus on building large-scale distributed training for modular neural network models that learn from several NLU subtasks simultaneously and accumulate common knowledge across different facets of understanding language (transfer learning). I also worked briefly on transfer learning for computer vision at Stanford.

More recently, I have been looking into cryptography and game theory, discovering and helping fix a large-scale Ethereum smart contract vulnerability.

Before Google, I did competitive programming (top-12 nationwide in Russia) and studied mathematics. I graduated with a joint BSc/MSc degree in mathematics from Moscow State University (thesis), where I focused on probability theory and discrete mathematics.

Fun fact: I used to be pretty serious about online poker, turning $50 into $25,000 over my freshman and sophomore years in college. The largest bluff I ever called was ~4x my monthly living budget at the time. Regrettably, my parents eventually counseled me out of this exciting career choice.

Google Publications

Previous Publications

  •   Crawling policies based on web page popularity prediction
      Liudmila Ostroumova, Ivan Bogatyy, Arseniy Chelnokov, Alexey Tikhonov, Gleb Gusev
      Advances in Information Retrieval, ECIR 2014 (2014)

  •   Studying page life patterns in dynamical web
      Alexey Tikhonov, Ivan Bogatyy, Pavel Burangulov, Liudmila Ostroumova, Vitaliy Koshelev, Gleb Gusev
      Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval, ACM (2013), pp. 905-908