Page Not Found
Page not found. Your pixels are in another canvas.
A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
About me
This is a page not in the main menu.
Published:
This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Short description of portfolio item number 1
Short description of portfolio item number 2
Published in Foundations of Data Science, 2019
We study the use of power weighted shortest path distance functions for clustering high-dimensional Euclidean data, under the assumption that the data is drawn from a collection of disjoint low-dimensional manifolds. We argue, theoretically and experimentally, that this leads to higher clustering accuracy. We also present a fast algorithm for computing these distances.
Recommended citation: McKenzie, Daniel and Damelin, Steven. (2019). "Power weighted shortest paths for clustering Euclidean data." Foundations of Data Science, 1(3). https://arxiv.org/abs/1905.13345
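To make the construction concrete, here is a minimal NumPy/SciPy sketch of power weighted shortest path distances. It is the naive all-pairs version, not the fast algorithm from the paper, and the function name and parameter choices are mine.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import shortest_path

def power_weighted_spd(X, p=2.0):
    """All-pairs power weighted shortest path distances (naive dense version).

    Each pair of points is joined by an edge of cost ||x_i - x_j||^p, and the
    distance between two points is the cost of the cheapest path between them.
    Large p penalizes long hops, so paths prefer to travel through dense regions.
    """
    W = squareform(pdist(X)) ** p              # edge costs on the complete graph
    return shortest_path(W, method="D")        # Dijkstra from every node

# Tiny demo: two well-separated clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)),
               rng.normal(3.0, 0.1, size=(20, 2))])
D = power_weighted_spd(X, p=4.0)               # within-cluster distances stay small
```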
Published in SIMODS, 2020
We apply tools from compressive sensing to the problem of finding clusters in graphs.
Download here
Published in GTA3 workshop at IEEE Big Data, 2020
We construct and analyze a knowledge graph for season one of the TV show Veronica Mars.
Download here
Published in SIOPT (to appear), 2021
We propose a new zeroth-order optimization algorithm that uses compressive sensing to approximate gradients.
Download here
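The flavor of the idea can be sketched in a few lines. This is an illustration of compressive-sensing gradient estimation in general, not the paper's algorithm; the function name, the Rademacher sensing directions, and the use of scikit-learn's Lasso as the sparse-recovery step are all my choices.

```python
import numpy as np
from sklearn.linear_model import Lasso

def cs_gradient_estimate(f, x, num_queries, delta=1e-4, alpha=1e-3, seed=0):
    """Sketch: estimate a (nearly) sparse gradient from few function queries.

    Each finite difference (f(x + delta*z) - f(x)) / delta is approximately
    <z, grad f(x)>, so stacking random directions z gives compressed linear
    measurements of the gradient; a sparse solver then recovers it from far
    fewer queries than there are dimensions.
    """
    rng = np.random.default_rng(seed)
    Z = rng.choice([-1.0, 1.0], size=(num_queries, x.size))   # Rademacher directions
    fx = f(x)
    y = np.array([(f(x + delta * z) - fx) / delta for z in Z])
    return Lasso(alpha=alpha, fit_intercept=False).fit(Z, y).coef_

# Demo: a 1000-dimensional function whose gradient has only five nonzero entries.
support = [3, 100, 250, 700, 999]
f = lambda x: sum(x[i] ** 2 for i in support)
g_hat = cs_gradient_estimate(f, np.ones(1000), num_queries=100)
```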
Published in ICML, 2021
We propose a new algorithm for ultra-high dimensional black-box optimization (over 1 million variables).
Download here
Published in (under review), 2021
We apply the FPN technology developed in an earlier work to the problem of predicting Nash equilibria in parametrized games.
Download here
Published in (under review), 2021
We propose a line search algorithm for zeroth-order optimization with low query complexity, both in theory and in practice.
Download here
Published in (Submitted to SIURO), 2021
We provide a criterion for converting ZO algorithms to the comparison-based setting.
Download here
Published in SIMODS (to appear), 2021
We further explore the use of shortest path metrics on high dimensional data.
Download here
Published in (submitted), 2021
We show how to convert the problem of minimizing a convex function over the standard probability simplex to that of minimizing a nonconvex function over the unit sphere.
Download here
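One change of variables consistent with this description squares the coordinates: if x = y ∘ y entrywise, then x lies on the probability simplex exactly when y lies on the unit sphere. The sketch below is my own toy illustration of that mapping, not code from the paper.

```python
import numpy as np

def simplex_to_sphere(f):
    """Return g(y) = f(y * y): if ||y||_2 = 1 then x = y * y is on the simplex.

    The entries of y * y are nonnegative and sum to ||y||_2^2 = 1, so minimizing
    g over the unit sphere corresponds to minimizing f over the probability
    simplex (trading the simplex constraint for a sphere constraint, and
    convexity of f for nonconvexity of g).
    """
    return lambda y: f(y * y)

# Toy example: a linear objective over the simplex, minimized at the vertex (0, 1, 0).
c = np.array([3.0, 1.0, 2.0])
f = lambda x: c @ x
g = simplex_to_sphere(f)

y = np.array([0.0, 1.0, 0.0])            # the corresponding point on the unit sphere
print(g(y), f(y * y))                     # both equal 1.0
```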
Published in AAAI, 2022
We propose a new, much faster, kind of backprop for implicit-depth neural networks.
Download here
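As a rough illustration of the setting, and of one widely used way to make the backward pass cheap, here is a toy PyTorch fixed-point layer: the forward iteration runs without building a computation graph, and gradients flow through a single extra application of the layer instead of an expensive implicit-differentiation solve. This is a hedged sketch in the spirit of the setting, not the paper's implementation; the map T and all names below are mine.

```python
import torch

def implicit_layer(T, x, max_iter=50, tol=1e-6):
    """Fixed-point ("implicit-depth") layer: return z with z = T(z, x).

    Sketch only: the forward iteration runs under no_grad, and the backward
    pass differentiates through one final application of T, avoiding the
    linear solve that exact implicit differentiation would require.
    """
    z = torch.zeros_like(x)
    with torch.no_grad():                      # cheap forward pass, no graph stored
        for _ in range(max_iter):
            z_next = T(z, x)
            converged = torch.norm(z_next - z) < tol
            z = z_next
            if converged:
                break
    return T(z, x)                             # one differentiable application for backprop

# Usage with a simple contractive map T(z, x) = 0.5 * tanh(z) + x.
T = lambda z, x: 0.5 * torch.tanh(z) + x
x = torch.randn(4, requires_grad=True)
loss = implicit_layer(T, x).sum()
loss.backward()                                # gradients flow through the last T call only
```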
Published in ACHA, 2022
We use tools from one-bit compressed sensing to construct a new algorithm for comparison-based optimization.
Download here
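To give a feel for why comparisons carry one-bit gradient information (a generic illustration, not the algorithm from the paper; all names below are mine): the sign of f(x + δu) − f(x) is, for small δ, the one-bit measurement sign(⟨u, ∇f(x)⟩), and averaging sign-weighted Gaussian directions recovers the gradient direction.

```python
import numpy as np

def comparison_gradient_direction(f, x, num_queries=2000, delta=1e-3, seed=0):
    """Sketch: estimate grad f(x) / ||grad f(x)|| from comparisons alone.

    sign(f(x + delta*u) - f(x)) is (for small delta) the one-bit measurement
    sign(<u, grad f(x)>); for Gaussian u, the average of sign-weighted
    directions is proportional in expectation to the normalized gradient.
    """
    rng = np.random.default_rng(seed)
    fx = f(x)
    g_hat = np.zeros_like(x)
    for _ in range(num_queries):
        u = rng.standard_normal(x.size)
        g_hat += np.sign(f(x + delta * u) - fx) * u
    return g_hat / np.linalg.norm(g_hat)

# Demo: the estimate lines up with the true gradient direction of a quadratic.
f = lambda x: np.sum((x - 1.0) ** 2)               # gradient at 0 is -2 * ones
g_true = -np.ones(10) / np.sqrt(10)
print(comparison_gradient_direction(f, np.zeros(10)) @ g_true)   # close to 1
```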
Published:
Based on this paper. Here are the slides. This talk has been given virtually and in person at several other venues.
Published:
Based on this and this paper. Here are the slides. Similar versions of this talk were or will be given at the ZiF Mathematics of Machine Learning conference and in the Mean-field games and optimal transport seminar.
Published:
Based on this paper. Here are the slides. This short ICML talk was actually presented by Yuchen Lou.
Undergraduate course, University of Georgia, 2014
(2014–2019) While a graduate student at UGA I taught Math 1113, a one-semester precalculus course, six times. The textbook we used was Precalculus by Julie Miller and Donna Gerken. For some sections we used the ALEKS homework system. You can find a sample syllabus here and a copy of my first day of class slides here.
Undergraduate course, University of Georgia, 2015
(2014–2019) While a graduate student at UGA I taught Math 2250, a one-semester Calculus 1 course, three times. We used the textbook University Calculus, Early Transcendentals by Hass, Weir and Thomas, together with the WeBWorK homework system. I experimented with a variety of teaching modalities, but one thing I found effective was creating a worksheet for every lesson, which we would start in class and students would finish at home. You can find an example of such a worksheet here. You can find a copy of the syllabus here.
Undergraduate course, University of California, Los Angeles, 2019
Math 32A is a large (~200 students), one-quarter course on multivariable calculus. Teaching Math 32A was an interesting experience, as it involved giving auditorium-style lectures as well as managing a grader and three TAs who met with the students in smaller groups. You can find a copy of the syllabus here. The reviews were mostly favorable.
Undergraduate course, University of California, Los Angeles, 2020
(2019–2021) At UCLA I co-developed and then taught (three times) Math 118, an introduction to the mathematics of data science. I tried to emphasize both theory and practice, so some lectures were slide-based presentations while others were more interactive, using Jupyter notebooks to play around with algorithms. I am happy to share my complete set of lecture slides, but I am not yet willing to make them completely public, so email me if you would like access.
Undergraduate course, University of California, Los Angeles, 2020
Math 170S is a mathematically rigorous introduction to statistics, using Hogg, Tanis and Zimmerman's Probability and Statistical Inference. This was the first course I taught entirely over Zoom.
Undergraduate course, University of California, Los Angeles, 2021
At UCLA I co-developed Math 151AH and Math 151BH, honors versions of the pre-existing applied numerical methods classes. This two-course sequence examines the theory and implementation of algorithms for fundamental problems in numerical analysis, for example least-squares problems and the singular value decomposition. We used this textbook. Here are the syllabi for Math 151AH and Math 151BH, and here is a sample lecture on one of my favorite algorithms, the power method.
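Since the power method comes up above, here is a minimal NumPy rendering of the textbook algorithm (not the lecture's code): repeatedly multiply a vector by the matrix and renormalize; when the dominant eigenvalue is well separated, the iterates converge to the dominant eigenvector.

```python
import numpy as np

def power_method(A, num_iters=1000, tol=1e-10, seed=0):
    """Estimate the dominant eigenvalue/eigenvector of A by the power method."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v
        v_new = w / np.linalg.norm(w)
        lam_new = v_new @ A @ v_new            # Rayleigh quotient estimate of the eigenvalue
        if abs(lam_new - lam) < tol:
            return lam_new, v_new
        v, lam = v_new, lam_new
    return lam, v

# Example: dominant eigenvalue of a small symmetric matrix.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
lam, v = power_method(A)
print(lam)   # close to the largest eigenvalue of A (about 4.618)
```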
Undergraduate course, Colorado School of Mines, 2022
A 40-person section of Linear Algebra.
Graduate course, Colorado School of Mines, 2023
A graduate course on numerical linear algebra. We will focus on three fundamental problems: solving linear systems, solving least-squares problems, and finding eigenvalues/eigenvectors. We'll present multiple computational approaches for each. If you are a Mines student enrolled in this course, you can find additional details on the Canvas page.
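For a rough sense of what these three problems look like in code (an illustrative NumPy snapshot, not course material), here are the simplest library-level versions side by side.

```python
import numpy as np

rng = np.random.default_rng(42)

# 1. Linear systems: solve Ax = b (an LU factorization under the hood).
A = rng.standard_normal((5, 5))
b = rng.standard_normal(5)
x = np.linalg.solve(A, b)

# 2. Least squares: minimize ||Cy - d||_2 via a QR factorization of C.
C = rng.standard_normal((20, 3))
d = rng.standard_normal(20)
Q, R = np.linalg.qr(C)
y = np.linalg.solve(R, Q.T @ d)          # same solution as np.linalg.lstsq(C, d)

# 3. Eigenvalues/eigenvectors of a symmetric matrix.
S = A + A.T
eigvals, eigvecs = np.linalg.eigh(S)
```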
Undergraduate course, Colorado School of Mines, 2023
A 40-person section of Linear Algebra. If you are enrolled in this class, please see our Canvas page for further information.