Braden Hancock
Ph.D. Candidate - Machine Learning
I'm a third-year Computer Science Ph.D. Candidate and NSF Fellow at Stanford University. I research how to get supervision signal from a human into a model as quickly, easily, and efficiently as possible. My ultimate goal is to make it possible for anyone—regardless of programming ability or machine learning expertise—to create near state-of-the-art machine learning systems in new domains in hours instead of months.
Research interests:
machine learning systems, weak supervision, multi-task learning, training data creation, learning from natural language, information extraction, natural language processing, knowledge integration

News

  • Jul. 2018: Presenting "Training Classifiers with Natural Language Explanations" paper (Babble Labble) at ACL (long talk).

  • Jul. 2018: A pre-print of "Title Generation for Web Tables" has been posted to arXiv.

  • Jun. 2018: "Snorkel MeTaL: Weak Supervision for Multi-Task Learning" presented (long talk) at DEEM (SIGMOD workshop).

  • Jun. 2018: "Fonduer: Knowledge Base Construction from Richly Formatted Data" presented (long talk) at SIGMOD.

  • May 2018: Chris Potts created an excellent assignment for Stanford CS224U (Natural Language Understanding) based on data programming/Snorkel.

  • Mar. 2018: Presented QALF and Snorkel MeTaL at the Spring Stanford DAWN retreat in Santa Cruz.

  • Dec. 2017: The Stanford Statistical ML group has had an incredible 2017 with 5 best paper awards! (NIPS, UAI, ICML, COLT, AAAI)

  • Dec. 2017: Presented Babble Labble demo at NIPS 2017.

  • Nov. 2017: Accepted a research internship position at Facebook AI Research (FAIR) with Antoine Bordes in Paris for Fall 2018!

  • Nov. 2017: Babble Labble was mentioned in the Biomedical Computation Review.

  • Nov. 2017: "A Machine-Compiled Database of Genome-Wide Association Studies" paper accepted to NIPS MLCB workshop + spotlight presentation.

  • Oct. 2017: Invited to serve on Program Committee for "Learning with Limited Labeled Data: Weak Supervision and Beyond" NIPS 2017 workshop.

  • Oct. 2017: Presented on weak supervision and Babble Labble to Google Ads Quality team in Mountain View.

Experience

Stanford University
2015-Present
Groups: Stanford StatsML, Stanford DAWN, Stanford NLP Group, Stanford InfoLab
Mentors: Chris Ré, Percy Liang
Topics: Weak supervision, multi-task learning, information extraction
Google
Summer 2017
Groups: Google Brain, Google Search
Mentors: Hongrae Lee, Cong Yu, Quoc Le
Topics: Abstractive summarization of semi-structured content, recurrent neural networks
MIT Lincoln Laboratory
Summers 2014-2015
Group: Computing & Analytics
Mentors: Vijay Gadepally, Jeremy Kepner
Topics: Recommender systems for Department of Defense applications, cryptography
Johns Hopkins University
Summer 2013
Group: Human Language Technology Center of Excellence
Mentors: Mark Dredze, Glen Coppersmith
Topics: Public health trend extraction from social media, topic modeling

Brigham Young University
2011-2015
Group: Design Exploration Research Group
Mentor: Chris Mattson
Topics: Multi-objective optimization, design space exploration
Air Force Research Laboratory
Summer 2011
Group: Turbine Engine Division
Mentor: John Clark
Topics: Evolutionary algorithms for optimization, turbine engine simulation

Research

Snorkel MeTaL: Weak Supervision for Multi-Task Learning
There is increasing interest in both multi-task learning (MTL) and weak supervision. We propose an end-to-end system for multi-task learning that leverages weak supervision provided at multiple levels of granularity. MeTaL learns a re-weighted model of these weak supervision sources that takes into account their place in the overall hierarchy of sub-tasks, then uses the combined signal to train a multi-task network that is automatically compiled from the structure of the sub-tasks.
DEEM (SIGMOD) 2018 (oral), In Progress
Babble Labble: Learning from Natural Language Explanations
We explore collecting natural language explanations for why annotators give the labels they do and parsing these into executable functions, which can then be used to generate noisy labels for large amounts of unlabeled data. The resulting probabilistically labeled training dataset can then be used to train a powerful downstream discriminative model for the task at hand. We find that utilizing these natural language explanations allows real-world users to train classifiers with comparable F1 scores up to 100 times faster than when they provide just labels.
ACL 2018 (oral), NIPS 2017 Demo
Snorkel: A System for Fast Training Data Creation
Snorkel is a system for rapidly creating, modeling, and managing training data. It is the flagship implementation of the new data programming paradigm for supporting weak supervision resources. Development is ongoing, with collaborators and active users at over a dozen major technical and medical organizations (e.g., Intel, Toshiba, the Jet Propulsion Laboratory (JPL), Alibaba, and Stanford Medicine) and 1000+ stars on GitHub. As one of the core contributors to Snorkel, I have implemented many of my other research products as extensions to or new capabilities in this framework.
VLDB 2018
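As a rough illustration of the data programming idea behind Snorkel (a toy sketch, not the actual Snorkel API; all function names here are invented): users write noisy heuristic labeling functions that vote on unlabeled examples, and the votes are combined into training labels.

```python
# Toy sketch of data programming (hypothetical names, not the Snorkel API):
# labeling functions (LFs) vote POSITIVE/NEGATIVE/ABSTAIN on each example,
# and the votes are combined into a noisy training label.
POSITIVE, NEGATIVE, ABSTAIN = 1, -1, 0

def lf_keyword_spouse(sentence):
    # Heuristic LF: spouse keywords suggest a positive label.
    return POSITIVE if ("wife" in sentence or "husband" in sentence) else ABSTAIN

def lf_keyword_sibling(sentence):
    # Heuristic LF: sibling keywords argue for a negative label.
    return NEGATIVE if ("brother" in sentence or "sister" in sentence) else ABSTAIN

def label(sentence, lfs):
    # Combine LF votes; a simple majority vote stands in for Snorkel's
    # generative model, which instead learns each LF's accuracy.
    total = sum(lf(sentence) for lf in lfs)
    return POSITIVE if total > 0 else NEGATIVE if total < 0 else ABSTAIN

lfs = [lf_keyword_spouse, lf_keyword_sibling]
print(label("His wife attended.", lfs))  # -> 1
```

The real system replaces the majority vote with a learned generative model over the labeling functions, producing probabilistic labels that a discriminative end model is then trained on.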
QALF: Information Extraction for the Long Tail via Question Answering
We use a Question Answering (QA) model as a flexible means of converting domain expertise expressed as natural language into weak supervision resources (labeling functions, or LFs). Preliminary results suggest that with as few as a dozen user inputs (domain-relevant questions), we can quickly build first-order extractors for new relations that lack distant supervision resources.
In Progress
Automatic Table Title Generation with a Pointer-Generator Network
We introduce a framework for generating titles for tables that are displayed out of their original context. We use a pointer-generator network, a recently introduced sequence-to-sequence model that is capable of both generating tokens and copying tokens from the input (such as rare and out-of-vocabulary words), resulting in titles that are both relevant and readable.
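For reference, the core of the pointer-generator mechanism (a sketch in notation of my choosing, not taken from the entry above) mixes a generation distribution with a copy distribution:

```latex
P(w) \;=\; p_{\text{gen}}\, P_{\text{vocab}}(w) \;+\; \bigl(1 - p_{\text{gen}}\bigr) \sum_{i \,:\, w_i = w} a_i
```

where $p_{\text{gen}} \in [0,1]$ is a learned soft switch, $P_{\text{vocab}}$ is the decoder's vocabulary distribution, and $a_i$ is the attention weight on input token $w_i$; the copy term lets the model emit rare or out-of-vocabulary tokens directly from the source table.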
Fonduer: Knowledge Base Construction from Richly Formatted Data
We introduce an information extraction framework that utilizes multiple representations of the data (structural, tabular, visual, and textual) to achieve state-of-the-art performance in four real-world extraction tasks. Our framework is currently in use commercially at Alibaba and with law enforcement agencies fighting online human trafficking.
SIGMOD 2018
A Machine-Compiled Database of Genome-Wide Association Studies
Using the multi-modal parsing and extraction tools from Fonduer and learning and inference tools from Snorkel, we construct a knowledge base of genotype/phenotype associations extracted from the text and tables in ~600 open-access papers from PubMed Central. Our system expands existing manually curated databases by approximately 20% with 92% precision.
Bio-Ontologies 2017, NIPS 2017 MLCB Workshop
Collective Supervision of Topic Models for Predicting Surveys with Social Media
We use topic models to correlate social media messages with survey outcomes and to provide an interpretable representation of the data. Rather than rely on fully unsupervised topic models, we use existing aggregated survey data to inform the inferred topics, a class of topic model supervision referred to as collective supervision.
AAAI 2016
Recommender Systems for the Department of Defense and Intelligence Community
With an internal committee of 20 MIT and DoD researchers, I spearheaded the construction of this report, which formalizes the components and complexities of recommender systems and surveys their existing and potential uses in the Department of Defense and U.S. Intelligence community.
MITLL Journal 2016
L-dominance: An approximate-domination mechanism for adaptive resolution of Pareto frontiers
We propose a mechanism called L-dominance (based on the Lamé curve) which promotes adaptive resolution of solutions on the Pareto frontier for evolutionary multi-objective optimization algorithms.
SMO Journal, AIAA ASM 2015, Honors Thesis
Best Student Paper
Reducing Shock Interactions in a High Pressure Turbine via 3D Aerodynamic Shaping
We show that the shock wave reflections inside a turbine engine can be approximated by calculating the 3D surface normal projections of the airfoils. Using a genetic algorithm, we produce superior airfoil geometries (with respect to high cycle fatigue failure) four orders of magnitude faster than the traditional CFD-based approach.
AIAA Journal, AIAA ASM 2014
Best Student Paper
The Smart Normal Constraint Method for Directly Generating a Smart Pareto Set
We introduce the Smart Normal Constraint (SNC) method, the first method capable of directly generating a smart Pareto set (a Pareto set in which the density of solutions varies such that regions of significant tradeoff have the greatest resolution). This is accomplished by iteratively updating an approximation of the design space geometry, which is used to guide subsequent searches in the design space.
SMO Journal, AIAA MDO 2013
Usage Scenarios for Design Space Exploration with a Dynamic Multiobjective Optimization Formulation
We investigate three usage scenarios for formulation space exploration, building on previous work that introduced a new way to formulate multi-objective problems, allowing a designer to update design objectives, constraints, and variables in a fluid manner that promotes exploration.
RiED Journal, ASME DETC 2012
Best Paper

Education

Stanford University
Ph.D. Computer Science (Jun. 2020)
Advisor: Chris Ré
Machine Learning Emphasis (GPA 4.00)
Brigham Young University
B.S. Mechanical Engineering, Mathematics Minor (Apr. 2015)
Advisor: Chris Mattson
Valedictorian, summa cum laude (GPA 4.00)

Awards

National Science Foundation Graduate Research Fellowship (NSF GRF)
2015
National Defense Science and Engineering Graduate Fellowship (NDSEG)
2015 (declined; cannot be held concurrently with the NSF GRF)
Phi Kappa Phi Marcus L. Urann Fellowship
2015 (1 of 6 in USA)
Stanford School of Engineering Finch Family Fellowship
2015
AIAA Vicki and George Muellner Scholarship
2014 (1 of 1 in USA)

Barry M. Goldwater Scholarship
2013
AIAA Orville and Wilbur Wright Scholarship
2013 (1 of 3 in USA)
ASME Kenneth Andrew Roe Scholarship
2012 (1 of 1 in USA)
National Merit Scholarship
2011
BYU Thomas S. Monson Presidential Scholarship
2011 (1 of 50)

Last updated on 29 Mar 2018.