Braden Hancock
Ph.D. Candidate - Machine Learning
I'm a third-year Computer Science Ph.D. student and NSF Fellow at Stanford University. My goal is to make it possible for anyone, regardless of programming ability or machine learning expertise, to create state-of-the-art machine learning systems in new domains in hours instead of months.
Research interests:
machine learning, weak supervision, learning from natural language, information extraction, semantic parsing, knowledge integration, program synthesis

News

  • Nov. 2017: Accepted a research internship position at Facebook AI Research (FAIR) with Antoine Bordes in Paris for Fall 2018!

  • Nov. 2017: "SystemX: Knowledge Base Construction from Richly Formatted Data" paper accepted to SIGMOD 2018.

  • Nov. 2017: "A Machine-Compiled Database of Genome-Wide Association Studies" paper accepted to NIPS MLCB workshop + spotlight presentation.

  • Oct. 2017: Invited to serve on Program Committee for "Learning with Limited Labeled Data: Weak Supervision and Beyond" NIPS 2017 workshop.

  • Oct. 2017: Presented on weak supervision and Babble Labble to the Google Ads Quality team in Mountain View.

  • Oct. 2017: "Babble Labble: Learning from Natural Language Explanations" demo accepted to NIPS 2017.

Experience

Stanford University
2015-Present
Groups: Stanford DAWN, Stanford NLP Group, Stanford InfoLab
Mentors: Chris Ré, Percy Liang
Topics: Weak supervision, natural language supervision, information extraction
Google
Summer 2017
Groups: Google Brain, Google Search
Mentors: Hongrae Lee, Cong Yu, Quoc Le
Topics: Abstractive summarization of semi-structured content, recursive neural networks
MIT Lincoln Laboratory
Summers 2014-2015
Group: Computing & Analytics
Mentors: Vijay Gadepally, Jeremy Kepner
Topics: Recommender systems for Department of Defense applications, cryptography
Johns Hopkins University
Summer 2013
Group: Human Language Technology Center of Excellence
Mentors: Mark Dredze, Glen Coppersmith
Topics: Public health trend extraction from social media, topic modeling
Brigham Young University
2011-2015
Group: Design Exploration Research Group
Mentor: Chris Mattson
Topics: Multi-objective optimization, design space exploration
Air Force Research Laboratory
Summer 2011
Group: Turbine Engine Division
Mentor: John Clark
Topics: Evolutionary algorithms for optimization, turbine engine simulation

Education

Stanford University
Ph.D. Computer Science (expected Jun. 2020)
Advisor: Chris Ré
Machine Learning Emphasis (GPA 4.00)
Brigham Young University
B.S. Mechanical Engineering, Mathematics Minor (Apr. 2015)
Advisor: Chris Mattson
Valedictorian, summa cum laude (GPA 4.00)

Research

Babble Labble: Learning from Natural Language Explanations
We explore collecting natural language explanations for why annotators give the labels they do and parsing these explanations into executable functions, which can then be applied to large amounts of unlabeled data. The resulting probabilistically labeled training dataset is used to train a powerful downstream discriminative model. We find that using these natural language explanations improves end-model performance many times faster than using labels alone (a minimal sketch of the pipeline follows below).
In Progress, NIPS 2017 Demo
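A minimal sketch of the Babble Labble pipeline in plain Python (the example explanation, function name, and keyword rule are hypothetical; the real system compiles explanations with a semantic parser rather than hand-written code):

    # An annotator's explanation -- "Label SPOUSE because the word 'wife'
    # appears between the two people" -- is compiled into an executable
    # labeling function, which is then applied to many unlabeled examples
    # to produce noisy training labels for a downstream model.
    SPOUSE, ABSTAIN = 1, -1

    def lf_wife_between(example):
        # Hypothetical compiled form of the explanation above.
        return SPOUSE if "wife" in example["text_between"].lower() else ABSTAIN

    unlabeled = [{"text_between": ", wife of "}, {"text_between": " met with "}]
    noisy_labels = [lf_wife_between(x) for x in unlabeled]
    print(noisy_labels)  # [1, -1]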
Automatic Table Title Generation with a Pointer-Generator Network
We introduce a framework for generating titles for tables that are displayed out of their original context. We use a pointer-generator network, a recently introduced sequence-to-sequence model that can both generate tokens and copy tokens (such as rare or out-of-vocabulary words) directly from the input, resulting in titles that are both relevant and readable (the copy mechanism is sketched below).
In Progress
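The copy mechanism at a single decoding step can be sketched with NumPy as follows (the toy vocabulary, attention weights, and gate value are illustrative, not taken from the model):

    # The final output distribution mixes the decoder's vocabulary distribution
    # with a copy distribution derived from attention over the source tokens,
    # so rare or out-of-vocabulary source words can still be produced.
    import numpy as np

    vocab = ["<unk>", "table", "of", "results", "2016"]  # toy vocabulary
    source_tokens = ["results", "2016", "q3"]            # "q3" is out-of-vocabulary

    p_vocab = np.array([0.1, 0.3, 0.2, 0.3, 0.1])  # decoder softmax over vocab
    attention = np.array([0.5, 0.3, 0.2])          # attention over source tokens
    p_gen = 0.7                                    # generate-vs-copy gate

    # Extend the vocabulary with source-only words so they can be copied.
    extended = vocab + [w for w in source_tokens if w not in vocab]
    p_final = np.zeros(len(extended))
    p_final[:len(vocab)] = p_gen * p_vocab
    for a, w in zip(attention, source_tokens):
        p_final[extended.index(w)] += (1 - p_gen) * a

    print(dict(zip(extended, p_final.round(3))))  # distribution sums to 1.0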
Fonduer: Knowledge Base Construction from Richly Formatted Data
We introduce an information extraction framework that utilizes multiple representations of the data (structural, tabular, visual, and textual) to achieve state-of-the-art performance on four real-world extraction tasks. Our framework is currently in use commercially at Alibaba and with law enforcement agencies fighting online human trafficking.
SIGMOD 2018
Snorkel: A System for Fast Training Data Creation
Snorkel is a system for rapidly creating, modeling, and managing training data. It is the flagship implementation of the new data programming paradigm for combining weak supervision resources (sketched below). Development is ongoing, with collaborators and active users at over a dozen major technical and medical organizations (e.g., Toshiba, the Jet Propulsion Laboratory, Alibaba, Chegg, and Stanford Medicine). As one of the core contributors to Snorkel, I have implemented many of my other research products as extensions to this framework.
VLDB 2018
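The data programming idea behind Snorkel can be illustrated with a conceptual sketch (this is not Snorkel's API; Snorkel learns labeling-function accuracies with a generative model, whereas the sketch below combines votes with a simple unweighted majority):

    # Several noisy labeling functions vote on unlabeled examples; their
    # (possibly conflicting) outputs are combined into training labels for a
    # downstream discriminative model.
    POSITIVE, NEGATIVE, ABSTAIN = 1, 0, -1

    def lf_keyword(x):
        return POSITIVE if "approved by" in x else ABSTAIN

    def lf_negation(x):
        return NEGATIVE if "not" in x else ABSTAIN

    def combine(votes):
        votes = [v for v in votes if v != ABSTAIN]
        if not votes:
            return ABSTAIN
        return POSITIVE if sum(votes) > len(votes) / 2 else NEGATIVE

    unlabeled = ["drug approved by the FDA", "not approved for this use"]
    labels = [combine([lf(x) for lf in (lf_keyword, lf_negation)]) for x in unlabeled]
    print(labels)  # [1, 0]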
A Machine-Compiled Database of Genome-Wide Association Studies
Using the multi-modal parsing and extraction tools from Fonduer and learning and inference tools from Snorkel, we construct a knowledge base of genotype/phenotype associations extracted from the text and tables in ~600 open-access papers from PubMed Central. Our system expands existing manually curated databases by approximately 20% with 92% precision.
Bio-Ontologies 2017, NIPS 2017 MLCB Workshop
Collective Supervision of Topic Models for Predicting Surveys with Social Media
We use topic models to correlate social media messages with survey outcomes and to provide an interpretable representation of the data. Rather than rely on fully unsupervised topic models, we use existing aggregated survey data to inform the inferred topics, a class of topic model supervision referred to as collective supervision.
AAAI 2016
Recommender Systems for the Department of Defense and Intelligence Community
With an internal committee of 20 MIT and DoD researchers, I spearheaded the construction of this report, which formalizes the components and complexities of recommender systems and surveys their existing and potential uses in the Department of Defense and U.S. Intelligence community.
MITLL Journal 2016
L-dominance: An approximate-domination mechanism for adaptive resolution of Pareto frontiers
We propose a mechanism called L-dominance (based on the Lamé curve) which promotes adaptive resolution of solutions on the Pareto frontier for evolutionary multi-objective optimization algorithms (the geometric idea is sketched below).
SMO Journal, AIAA ASM 2015, Honors Thesis
Best Student Paper
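The geometric ingredient can be sketched for two-objective minimization as follows (the region size, exponent, and exact dominance rule here are assumptions for illustration; the paper's formulation may differ):

    # A candidate is discarded if it is Pareto-dominated by an archived point or
    # falls inside a Lame-curve (superellipse) neighborhood of one, so solutions
    # that add little resolution to the frontier are filtered out.
    def l_dominates(a, b, size=0.1, n=4):
        """Return True if point a approximately dominates point b (minimization)."""
        if all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b)):
            return True  # classical Pareto dominance
        # Lame-curve neighborhood check (illustrative sizing).
        return sum((abs(y - x) / size) ** n for x, y in zip(a, b)) <= 1.0

    archive = [(0.2, 0.8), (0.5, 0.5)]
    candidate = (0.52, 0.48)
    keep = not any(l_dominates(p, candidate) for p in archive)
    print(keep)  # False: the candidate is too close to an archived point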
Reducing Shock Interactions in a High Pressure Turbine via 3D Aerodynamic Shaping
We show that the shock wave reflections inside a turbine engine can be approximated by calculating the 3D surface normal projections of the airfoils. Using a genetic algorithm, we produce superior airfoil geometries (with respect to high-cycle fatigue failure) four orders of magnitude faster than the traditional CFD-based approach.
AIAA Journal, AIAA ASM 2014
Best Student Paper
The Smart Normal Constraint Method for Directly Generating a Smart Pareto Set
We introduce the Smart Normal Constraint (SNC) method, the first method capable of directly generating a smart Pareto set (a Pareto set in which the density of solutions varies such that regions of significant tradeoff have the greatest resolution). This is accomplished by iteratively updating an approximation of the design space geometry, which is used to guide subsequent searches in the design space.
SMO Journal, AIAA MDO 2013
Usage Scenarios for Design Space Exploration with a Dynamic Multiobjective Optimization Formulation
We investigate three usage scenarios for formulation space exploration, building on previous work that introduced a new way to formulate multi-objective problems, allowing a designer to change or update design objectives, constraints, and variables in a fluid manner that promotes exploration.
RiED Journal, ASME DETC 2012
Best Paper

Awards

National Science Foundation Graduate Research Fellowship (NSF GRF)
2015
National Defense Science and Engineering Graduate Fellowship (NDSEG)
2015 (declined due to incompatibility with the NSF GRF)
Phi Kappa Phi Marcus L. Urann Fellowship
2015 (1 of 6 in USA)
Stanford School of Engineering Finch Family Fellowship
2015
AIAA Vicki and George Muellner Scholarship
2014 (1 of 1 in USA)
Barry M. Goldwater Scholarship
2013
AIAA Orville and Wilbur Wright Scholarship
2013 (1 of 3 in USA)
ASME Kenneth Andrew Roe Scholarship
2012 (1 of 1 in USA)
National Merit Scholarship
2011
BYU Thomas S. Monson Presidential Scholarship
2011 (1 of 50)

Last updated on 4 Nov 2017.