Titles are hyperlinked to PDF copies of the final project write-ups. Course coordinator: Pat Keef

  • Author: Carl Felstiner

    Title: Alpha-Beta Pruning

    Abstract: This paper serves as an introduction to the ways computers are built to play games. We implement the basic minimax algorithm and expand on it by finding ways to reduce the portion of the game tree that must be generated to find the best move (a short sketch of minimax with alpha-beta pruning appears at the end of this entry). We tested our algorithms on ordinary Tic-Tac-Toe, Hex, and 3-D Tic-Tac-Toe. With our algorithms, we were able to find the best opening move in Tic-Tac-Toe while generating only 0.34% of the nodes in the game tree. We also explored some mathematical features of Hex and provided proofs of them.

    Faculty Adviser: David Guichard
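
    The following is a minimal, illustrative sketch of minimax with alpha-beta pruning on a small explicit game tree, not the implementation described in the paper; the tree encoding (nested lists with numeric leaves) and the function name are assumptions made for the example.

    ```python
    import math

    def alphabeta(node, maximizing, alpha=-math.inf, beta=math.inf):
        """Return the minimax value of `node`, skipping branches that cannot
        change the final decision (alpha-beta pruning)."""
        if isinstance(node, (int, float)):          # leaf: terminal evaluation
            return node
        if maximizing:
            value = -math.inf
            for child in node:
                value = max(value, alphabeta(child, False, alpha, beta))
                alpha = max(alpha, value)
                if alpha >= beta:                   # MIN will never allow this branch
                    break                           # prune remaining children
            return value
        else:
            value = math.inf
            for child in node:
                value = min(value, alphabeta(child, True, alpha, beta))
                beta = min(beta, value)
                if alpha >= beta:                   # MAX already has something better
                    break                           # prune remaining children
            return value

    # Classic textbook tree: the minimax value is 3, and the leaves 4 and 6
    # are never examined because their branch is pruned.
    tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
    print(alphabeta(tree, maximizing=True))         # -> 3
    ```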

  • Author: Sarah Fix

    Title: Advanced Tests for Convergence

    Abstract: The primary objective of this paper is to discuss advanced tests of convergence for infinite series. The commonly used tests taught in early calculus classes, including the Comparison, Root, and Ratio Tests, are not sufficient to give results for more complicated infinite series. These frequently used tests are discussed in the paper, along with examples of infinite series that have interesting properties, in order to effectively examine the more advanced Kummer's and Raabe's Tests (stated for reference at the end of this entry). We demonstrate some applications of these more general tests through examples where the simpler tests fail to yield results. While the main focus of this project is on advanced tests for convergence, we also illustrate connections between the different tests.

    Faculty Adviser: Russ Gordon
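
    For reference, one common formulation of the two tests named above (the paper's exact statements and hypotheses may differ):

    ```latex
    Let $\sum a_n$ be a series of positive terms.

    \textbf{Raabe's Test.} Put $R_n = n\left(\frac{a_n}{a_{n+1}} - 1\right)$.
    If $\lim_{n\to\infty} R_n = L$ exists, then $\sum a_n$ converges when
    $L > 1$ and diverges when $L < 1$; the test is inconclusive when $L = 1$.

    \textbf{Kummer's Test.} Choose positive numbers $p_n$ and put
    $K_n = p_n \frac{a_n}{a_{n+1}} - p_{n+1}$. If
    $\liminf_{n\to\infty} K_n > 0$, then $\sum a_n$ converges; if
    $K_n \le 0$ for all large $n$ and $\sum 1/p_n$ diverges, then
    $\sum a_n$ diverges. Raabe's Test is the special case $p_n = n$.
    ```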

  • Author: Tyler Landau

    Title: Classifications of Frieze Groups and an Introduction to Crystallographic Groups

    Abstract: We will be looking at two special families of infinite plane symmetry groups, namely the frieze and crystallographic (wallpaper) groups. Within each of these families we aim to describe what patterns we can form, in particular what criteria determine which of the 7 frieze or 17 wallpaper groups a given pattern is a part of (summarized at the end of this entry). For the frieze groups, we will also look at the construction of each pattern, their isomorphism classes, and why there are only 7 of them.

    Faculty Adviser: Barry Balof
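
    For reference, a compact summary of one common way to distinguish the seven frieze groups (IUC notation) by the symmetries they contain in addition to the translations; the paper's own notation and criteria may differ.

    ```latex
    \begin{tabular}{ll}
    p1   & translations only \\
    p11g & glide reflection \\
    p1m1 & vertical reflection lines \\
    p11m & horizontal reflection line (and glide reflections) \\
    p2   & half-turn ($180^\circ$) rotations \\
    p2mg & vertical reflections, glide reflection, half-turns \\
    p2mm & horizontal and vertical reflections, half-turns \\
    \end{tabular}
    ```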

  • Author: Nathaniel Larson

    Title: The Bernoulli Numbers: A Brief Primer

    Abstract: In this primer, we explore the diverse properties of a rational sequence known as the Bernoulli numbers. Since the discovery of the numbers in the early eighteenth century, mathematicians have uncovered a vast web of connections between them and core branches of mathematics. We begin with an overview of the historical developments leading to the derivation of the Bernoulli numbers, then use a process similar to that of Jakob Bernoulli to derive the sequence (a short computational sketch appears at the end of this entry), and finally consider a variety of applications. We hope, above all, to demonstrate how useful and unexpected mathematics can be.

    Faculty Adviser: Barry Balof
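
    A minimal computational sketch (not the paper's derivation): the first few Bernoulli numbers computed exactly from the standard recursion that follows from their generating function, using the convention B_1 = -1/2. The function name is illustrative.

    ```python
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return [B_0, B_1, ..., B_n] as exact fractions, using the
        recursion B_0 = 1 and sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(-s / (m + 1))
        return B

    print([str(b) for b in bernoulli(8)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
    ```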

  • Author: Yurixy Lopez Martinez

    Title: The Monty Hall Problem

    Abstract: This paper begins by offering a detailed explanation of the solution to the Monty Hall Problem utilizing decision trees and mathematical concepts of conditional probability, mainly Bayes' Theorem. We will proceed to investigate the various versions of the problem that have arisen throughout the years among scholars, mainly focusing on the benefits of particular strategies (see the simulation sketch at the end of this entry). We will conclude by briefly discussing some applications of the Monty Hall Problem to other disciplines, mainly their probabilistic aspects.

    Faculty Adviser: Pat Keef
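
    A quick simulation sketch (not part of the paper) comparing the two basic strategies, staying with the first pick versus switching after the host reveals a losing door; switching wins about two-thirds of the time.

    ```python
    import random

    def play(switch):
        """Play one round of the classic three-door game; return True on a win."""
        doors = range(3)
        car = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that is neither the contestant's pick nor the car.
        opened = random.choice([d for d in doors if d not in (pick, car)])
        if switch:
            pick = next(d for d in doors if d not in (pick, opened))
        return pick == car

    trials = 100_000
    for switch in (False, True):
        wins = sum(play(switch) for _ in range(trials))
        print(f"switch={switch}: win rate ~ {wins / trials:.3f}")
    # Typical output: about 0.333 when staying and 0.667 when switching.
    ```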

  • Author: Chaoyi Lou

    Title: Artificial Neural Networks: their Training Process and Applications

    Abstract: Since the 1800s, scientists have sought to understand the workings of the human brain. The concept of Artificial Intelligence developed over many years, and thanks to more advanced computers and larger memories, we now have "human-like" computer programs that can perform tasks for us. An Artificial Neural Network is one that learns from past data and makes predictions about the future. Through this project, we not only gain a mathematical background on ANNs, but also touch on the networks used for image data, Convolutional Neural Networks (a small training sketch appears at the end of this entry).

    Faculty Adviser: Doug Hundley
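
    A minimal sketch of the kind of training loop the abstract refers to: a one-hidden-layer network fit to XOR with backpropagation and plain gradient descent. The architecture, learning rate, iteration count, and random seed are illustrative assumptions, not details from the project.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Weights and biases for a 2 -> 4 -> 1 network.
    W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
    W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

    lr = 1.0
    for step in range(10_000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)                      # hidden activations
        out = sigmoid(h @ W2 + b2)                    # network output
        # Backward pass (chain rule) for squared error.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient-descent updates.
        W2 -= lr * (h.T @ d_out)
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * (X.T @ d_h)
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    print(np.round(out, 2).ravel())   # typically close to [0, 1, 1, 0],
                                      # depending on the random initialization
    ```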

  • Author: Taka Olds

    Title: Forbidden graph minors

    Abstract: Identifying forbidden graph minors gives an additional way to characterize families of graphs that share a certain property. We will review Kuratowski's identification and proof of the forbidden topological minors of planar graphs (illustrated at the end of this entry). In addition, Robertson and Seymour's Graph Minor Theorem will be examined in relation to sets of forbidden minors. By applying these concepts to two well-studied graph properties, we will gain some insight into the significance of these famous results.

    Faculty Adviser: David Guichard
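
    A small illustration of Kuratowski's characterization using the networkx library (an assumption; the paper itself is theoretical): K5 and K3,3 are the forbidden graphs for planarity.

    ```python
    import networkx as nx

    graphs = [
        ("K4", nx.complete_graph(4)),
        ("K5", nx.complete_graph(5)),
        ("K3,3", nx.complete_bipartite_graph(3, 3)),
        ("Petersen", nx.petersen_graph()),
    ]
    for name, G in graphs:
        planar, _ = nx.check_planarity(G)
        print(f"{name}: planar = {planar}")
    # K4 is planar; K5, K3,3, and the Petersen graph (which contains a
    # subdivision of K3,3) are not.
    ```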

  • Author: Sarah Rothschild

    Title: An Exploratory Statistical Analysis of Gentrification and Neighborhood Change in Seattle

    Abstract: This study examines changing patterns of urban characteristics in Seattle, Washington from 2000 to 2017, focusing on patterns of urban displacement and segregation through statistical techniques and exploratory data analysis. Utilizing Census data and statistical software, we will analyze how Seattle neighborhoods have changed over this period with regard to socioeconomic status, race, and class. This project further employs data visualization and geospatial analysis, seeking to draw conclusions about the nature of gentrification in the city in recent decades.

    Faculty Adviser: Marina Ptukhina

  • Author: Alexander F. Shaw

    Title: Classifying Some Infinite Abelian Groups and Answering Kaplansky's Test Questions

    Abstract: In his influential book Infinite Abelian Groups, Irving Kaplansky posed two general questions designed to test classifications of abelian groups (stated at the end of this entry). This work answers the questions for a subclass of abelian p-groups that are entirely characterized by their socles (the subgroups consisting of 0 and all elements of order p). The socle is generalized as a valuated vector space, and much of this work is dedicated to classifying this generalization in terms of Ulm invariants. For these groups, the questions can thus be translated in two steps: first into the terms of socles and then into the terms of Ulm invariants. The first step is made by Fuchs and Irwin in [1]. This work makes the second step, building up the results and classifications while assuming only a working knowledge of introductory algebra. It culminates in answers to Kaplansky's test questions and an example to which the results apply.

    Faculty Adviser: Pat Keef
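
    For reference, the two test questions from Kaplansky's Infinite Abelian Groups mentioned above, stated in the usual way (the paper may phrase them differently):

    ```latex
    \textbf{Test Problem I.} If $G$ and $H$ are abelian groups such that each is
    isomorphic to a direct summand of the other, are $G$ and $H$ isomorphic?

    \textbf{Test Problem II.} If $G \oplus G \cong H \oplus H$, are $G$ and $H$
    isomorphic?
    ```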

  • Author: Lori Sheng

    Title: Data-Dimensionality Reduction Using PCA, t-SNE, Sammon map and Autoencoder

    Abstract: In modern society, we are surrounded by a large amount of data from various fields: digital communication, education, economics, medical care, and so on. We want to extract valuable information from the data in order to evaluate and to predict results. However, raw data can be hard to interpret and harder to model, as we may have missing data, data that contains a lot of noise, or data that is extremely high-dimensional. In order to deal with extremely high-dimensional data, we wish to map the data to low dimensions (two or three) in order to visualize it. In this project, we will compare and explore several methods: Principal Component Analysis (PCA), t-distributed Stochastic Neighbor Embedding (t-SNE), the Sammon map, and the autoencoder (a type of neural network); a brief sketch appears at the end of this entry.

    Faculty Adviser: Doug Hundley
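
    A minimal sketch of the kind of comparison described above, using scikit-learn's PCA and t-SNE on a small built-in dataset; the dataset, parameters, and plotting choices are assumptions, not the project's own.

    ```python
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.manifold import TSNE

    X, y = load_digits(return_X_y=True)               # 1797 samples, 64 dimensions

    X_pca = PCA(n_components=2).fit_transform(X)
    X_tsne = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X)

    fig, axes = plt.subplots(1, 2, figsize=(10, 4))
    for ax, emb, title in [(axes[0], X_pca, "PCA"), (axes[1], X_tsne, "t-SNE")]:
        ax.scatter(emb[:, 0], emb[:, 1], c=y, s=5, cmap="tab10")
        ax.set_title(f"{title} projection of the digits data")
    plt.tight_layout()
    plt.show()
    ```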

  • Author: Sarah Vesneske

    Title: Continuous, nowhere differentiable functions

    Abstract: The main objective of this paper is to build a context in which it can be argued that most continuous functions are nowhere differentiable. We use properties of complete metric spaces, sets of the first category in the sense of Baire, and the Weierstrass Approximation Theorem to reach this objective. We also look at several examples of such functions (the classical example is recalled at the end of this entry) and methods to prove their lack of differentiability at any point.

    Faculty Adviser: Russ Gordon
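
    For reference, the classical example of such a function (the paper may use a different construction):

    ```latex
    W(x) = \sum_{n=0}^{\infty} a^{n} \cos\!\left(b^{n}\pi x\right),
    \qquad 0 < a < 1,\quad b \text{ an odd integer},\quad ab > 1 + \tfrac{3\pi}{2}.
    ```
    Under these hypotheses (Weierstrass's original ones; Hardy later relaxed them to ab >= 1), the series converges uniformly by the Weierstrass M-test, so W is continuous, yet W is differentiable at no point.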