
Purdue Computer Science Undergrad Receives Honorable Mention for CRA Outstanding Researcher Award

01-28-2020

Cameron Ruggles received an honorable mention for the 2020 Outstanding Undergraduate Researcher Award from the Computing Research Association (CRA). The award recognizes undergraduate students in North American universities who show outstanding potential in an area of computing research. Ruggles is a senior in the Department of Computer Science, completing a track in Machine Intelligence.


Working with David Gleich, the Jyoti and Aditya Mathur Associate Professor of Computer Science, Ruggles is developing parallel algorithms for metric-constrained optimization. His paper, “A Parallel Projection Method for Metric Constrained Optimization,” is scheduled to appear in the proceedings of SIAM CSC 2020.

Some real-world data problems cannot be solved efficiently by computers: even with a modestly sized dataset, an exact answer could take hundreds of years to compute. Per Ruggles, “These are called NP-hard problems and they have very interesting applications. For example, the Traveling Salesman Problem (TSP), in which one identifies the shortest route to go to several destinations and return (like traveling to several houses for delivery), is inefficient to solve.”
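As a rough illustration (a hypothetical toy sketch, not part of Ruggles's work), the brute-force approach to TSP below checks every possible ordering of the stops, so its cost grows factorially with the number of destinations; that is the kind of blow-up that makes exact solutions infeasible.

```python
# Minimal brute-force TSP sketch (illustrative only): enumerate every tour.
# With n stops there are (n-1)! orderings, so this becomes hopeless very quickly.
from itertools import permutations

def brute_force_tsp(dist):
    """dist[i][j] is the travel cost between stops i and j; return the cheapest round trip."""
    n = len(dist)
    best_cost, best_tour = float("inf"), None
    for order in permutations(range(1, n)):        # fix stop 0 as the start and end
        tour = (0,) + order + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

# Four delivery stops are instant; twenty stops already mean ~1.2e17 candidate tours.
dist = [[0, 3, 4, 7],
        [3, 0, 7, 4],
        [4, 7, 0, 3],
        [7, 4, 3, 0]]
print(brute_force_tsp(dist))   # (14, (0, 1, 3, 2, 0))
```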

To deal with these intractable problems, researchers often turn to approximation algorithms. These algorithms won’t get the exact answer, but they will get something that is provably almost as good.

“I am interested in approximation algorithms because they are an attempt to provide good solutions for problems that provably have no efficient solution, given the unrealistic amount of time it would take to solve a trivial example using exact algorithms,” said Ruggles.
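To make “provably almost as good” concrete, here is a sketch of one textbook approximation algorithm, the minimum-spanning-tree (double-tree) heuristic for metric TSP: when the distances obey the triangle inequality, the tour it returns is guaranteed to cost at most twice the optimum. This is a standard illustrative algorithm with names of my own choosing, not one of the methods from Ruggles's paper.

```python
def mst_tsp_approx(dist):
    """Double-tree 2-approximation for metric TSP: build a minimum spanning tree,
    then walk it in preorder and shortcut repeated vertices.
    Requires dist to satisfy the triangle inequality."""
    n = len(dist)
    # Prim's algorithm (simple O(n^2) version) rooted at vertex 0.
    in_tree = [False] * n
    best = [float("inf")] * n
    parent = [-1] * n
    best[0] = 0.0
    children = [[] for _ in range(n)]
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        if parent[u] >= 0:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and dist[u][v] < best[v]:
                best[v], parent[v] = dist[u][v], u
    # Preorder walk of the tree; shortcutting relies on the triangle inequality,
    # which is what bounds the tour at twice the optimal length.
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    tour.append(0)
    cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
    return cost, tour

# Toy usage: Manhattan distances among four points, so the metric assumption holds.
dist = [[0, 3, 4, 7],
        [3, 0, 7, 4],
        [4, 7, 0, 3],
        [7, 4, 3, 0]]
print(mst_tsp_approx(dist))    # tour cost is guaranteed to be at most 2x the optimum
```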

One of the standard methods in approximation algorithms is metric-constrained optimization, which hunts for a set of distances that approximately solves the original problem. Just as distances on a map help us know what is near and far, the distances the algorithm produces help it understand what is close to and far from a good solution.
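Concretely, “metric” means the distance variables must satisfy the triangle inequality for every triple of objects, which is where the $O(n^3)$ constraints mentioned in the paper's abstract (reproduced below) come from. The small sketch here, with hypothetical helper names, just enumerates those constraints to show how quickly they pile up.

```python
from itertools import combinations

def triangle_constraints(n):
    """Yield the triangle-inequality constraints x[i,j] <= x[i,k] + x[j,k]
    over the pairwise distance variables for n objects: three constraints
    per unordered triple, so Theta(n^3) constraints in total."""
    for i, j, k in combinations(range(n), 3):
        yield ((i, j), (i, k), (j, k))   # x[i,j] <= x[i,k] + x[j,k]
        yield ((i, k), (i, j), (j, k))   # x[i,k] <= x[i,j] + x[j,k]
        yield ((j, k), (i, j), (i, k))   # x[j,k] <= x[i,j] + x[i,k]

# Even a modest dataset generates an enormous constraint set:
# n = 1000 objects -> 3 * C(1000, 3), roughly 5e8 triangle inequalities.
print(sum(1 for _ in triangle_constraints(20)))   # 3 * C(20, 3) = 3420
```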

Even though metric-constrained optimization is widely used in theoretical studies of algorithms, it hasn't always been used in practice. “Even though metric-based optimization algorithms are ‘easier’ than the exact algorithm, they are still extremely challenging and time-consuming problems to solve,” said research advisor Gleich. “Ruggles’s work found a new way to solve these problems using parallel computers that makes it feasible to do in a day what used to take weeks,” he added.

As to future developments, Ruggles added, “Logistics and delivery companies hire computer scientists to solve the TSP using some of these approximation algorithms to save money on transportation costs. That is why I am so interested in them: with approximation algorithms we can provide usable solutions to problems, particularly where we already know how to compute the answer but can’t because of the time required. Potentially, we can unlock those solutions and apply them in the real world.”


Writer: Emily Kinsell, 765-494-0669, emily@purdue.edu, @emilykinsell

Source: Cameron Ruggles, cruggles@purdue.edu

 


 

Abstract

A Parallel Projection Method for Metric Constrained Optimization

Cameron Ruggles, Nate Veldt, David F. Gleich

Many clustering applications in machine learning and data mining rely on solving metric-constrained optimization problems. These problems are characterized by $O(n^3)$ constraints that enforce triangle inequalities on distance variables associated with $n$ objects in a large dataset. Despite its usefulness, metric-constrained optimization is challenging in practice due to the cubic number of constraints and the high-memory requirements of standard optimization software. Recent work has shown that iterative projection methods are able to solve metric-constrained optimization problems on a much larger scale than was previously possible, thanks to their comparatively low memory requirement. However, the major limitation of projection methods is their slow convergence rate. In this paper we present a parallel projection method for metric-constrained optimization which allows us to speed up the convergence rate in practice. The key to our approach is a new parallel execution schedule that allows us to perform projections at multiple metric constraints simultaneously without any conflicts or locking of variables. We illustrate the effectiveness of this execution schedule by implementing and testing a parallel projection method for solving the metric-constrained linear programming relaxation of correlation clustering. We show numerous experimental results on problems involving up to 2.9 trillion constraints. 
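The basic step inside such a projection method is inexpensive: enforcing a single violated triangle inequality only touches the three distance variables it involves. The sketch below is a simplified illustration of one such Euclidean projection under that assumption; it is not the paper's implementation, which wraps steps like this in the conflict-free parallel execution schedule described in the abstract and handles the full linear programming relaxation.

```python
def project_triangle(x, ij, ik, jk):
    """Project the current distances x (a dict keyed by index pairs) onto the
    halfspace x[ij] <= x[ik] + x[jk], i.e. enforce one triangle inequality.
    Only three variables change, which is what makes per-constraint projections
    cheap and amenable to a conflict-free parallel schedule."""
    violation = x[ij] - x[ik] - x[jk]
    if violation > 0:
        # The constraint row is (+1, -1, -1); its squared Euclidean norm is 3.
        step = violation / 3.0
        x[ij] -= step
        x[ik] += step
        x[jk] += step
    return x

# Toy usage with hypothetical data: distances among three objects, one violation.
x = {(0, 1): 5.0, (0, 2): 1.0, (1, 2): 1.0}        # 5 > 1 + 1 violates the metric
x = project_triangle(x, (0, 1), (0, 2), (1, 2))
print(x)                                            # {(0, 1): 4.0, (0, 2): 2.0, (1, 2): 2.0}
```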

 

