Mathematics Graduate Seminar - Fall 2023 and Spring 2024

The seminar is scheduled from 1:30 to 2:20 pm on Thursday afternoons in Bruner Hall 127.

DATE | SPEAKER | TITLE/ABSTRACT

4/11/2024 Mr. Shadrach Mbativga

Title: Comparative Analysis Between Runge-Kutta Methods and Multistep Numerical Methods for ODEs

 

Abstract: Single-step numerical methods for Ordinary Differential Equations, such as Euler's forward method, the improved Euler method, and the Runge-Kutta methods, use just one previous value of the solution of an Initial Value Problem (IVP) to approximate the value at the current time step. These methods are generally self-starting, meaning they don't rely on external schemes or models to kick-start the iteration process.

 

Multi-step methods, on the other hand, need multiple previous solution values of the problem to be able to compute a value for the current step. Unlike single-step schemes, multi-step methods usually require another model to kick-start the process, meaning they are not self-starting.

 

This research focuses on the Runge-Kutta methods, one of the most popular families of single-step numerical methods. To appreciate their efficiency and capabilities, we compare the performance of the fourth-order Runge-Kutta method with the Adams-Bashforth-Moulton predictor-corrector method, a multi-step method that is also very popular in its own right. Chapter one briefly reviews the Euler method, the most basic numerical method; chapter two covers the fundamentals of the Runge-Kutta methods; and chapter three draws a comparative analysis between the two methods. A first-order ODE is solved over various step sizes using both methods, and a comparison is made in terms of accuracy, rate of convergence, and computational cost.
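As a rough sketch of the comparison (not the speaker's code), both methods can be run on an illustrative test problem y' = -2y, y(0) = 1, whose exact solution is e^(-2t). The multistep method is started with RK4 values, illustrating the "not self-starting" point above; the step size and test equation are arbitrary choices:

```python
import math

def f(t, y):          # illustrative test ODE: y' = -2y, exact solution y = exp(-2t)
    return -2.0 * y

def rk4_step(t, y, h):
    k1 = f(t, y)
    k2 = f(t + h/2, y + h/2 * k1)
    k3 = f(t + h/2, y + h/2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

def solve_rk4(y0, h, n):
    ys = [y0]
    for i in range(n):
        ys.append(rk4_step(i * h, ys[-1], h))
    return ys

def solve_abm4(y0, h, n):
    # first three steps come from RK4: multistep methods are not self-starting
    ys = solve_rk4(y0, h, 3)
    for i in range(3, n):
        fs = [f((i - k) * h, ys[i - k]) for k in range(4)]   # f_i, f_{i-1}, f_{i-2}, f_{i-3}
        # Adams-Bashforth 4-step predictor
        yp = ys[i] + h/24 * (55*fs[0] - 59*fs[1] + 37*fs[2] - 9*fs[3])
        # Adams-Moulton corrector, evaluated at the predicted value
        yc = ys[i] + h/24 * (9*f((i + 1) * h, yp) + 19*fs[0] - 5*fs[1] + fs[2])
        ys.append(yc)
    return ys

h, n = 0.1, 10
exact = math.exp(-2.0)
err_rk4 = abs(solve_rk4(1.0, h, n)[-1] - exact)
err_abm = abs(solve_abm4(1.0, h, n)[-1] - exact)
print(err_rk4, err_abm)
```

Both errors are small at this step size; the cost difference the abstract alludes to is that RK4 needs four evaluations of f per step while the predictor-corrector pair needs only two.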

4/4/2024 Mr. Belguutei Ariuntugs

Title: Response Surface Methodology Based Hyperparameter Optimization of Actuarial Neural Networks

3/21/2024 Ms. Deborah Okoli

Title: Solution Methods For One-Factor Bond Pricing Model

Abstract: In finance and economics, a one-factor model is one in which a single Brownian process, as the only source of randomness, appears in the formulation of the short-rate model. In this project, approximate solutions of a one-factor bond pricing model are obtained via two proposed solution methods: the Elzaki Adomian Decomposition Method (EADM) and the Laplace Adomian Decomposition Method (LADM). Illustrative examples are considered, and the results are in good agreement with those already in the literature. The methods are effective in this application, unlike the classical Elzaki and Laplace transforms. Hence, they are recommended for other financial models, such as multi-factor and nonlinear models.

 

3/7/2024 Ms. Meredith Hall

Title: Introduction to Hopf Algebras

Abstract: In this talk we will cover the background and structure of Hopf algebras, as well as the relationships of the specific algebras GL(2) and SL(2).

2/8/2024 Dr. Amy Chambers

Title:  Intro to Labeled Graph Algebras 

 

Abstract:  In this talk, we will introduce definitions and examples involving Labeled Graph C*-Algebras, which are generalizations of Graph C*-Algebras discussed in previous seminars. We will also introduce a graph desingularization process described in an upcoming paper.

2/1/2024 Dr. Gayan Maduranga

Title: Optimization Techniques in Deep Learning

 

Abstract:  This seminar delves into the essential optimization techniques in deep learning, with a primary focus on first-order gradient descent methods, presented through an engaging, hands-on session within a Jupyter Notebook environment. We will explore first-order methods such as Stochastic Gradient Descent (SGD), Momentum, and Adam, shedding light on their crucial roles in enhancing model training efficiency and dynamics. While our practical exercises will concentrate on these first-order techniques, we will also provide an introductory discussion on second-order methods, such as Newton's Method and Quasi-Newton methods like BFGS, to offer insights into their computational intricacies and theoretical underpinnings.

 

This hands-on approach will not only solidify a theoretical understanding of key optimization strategies but also empower attendees with the practical skills needed to apply these techniques to actual deep learning scenarios. Designed to be both accessible for beginners and enriching for experienced practitioners, this seminar aims to enhance attendees' knowledge and practical abilities in applying first-order optimization methods in deep learning, while also providing a foundational understanding of the more complex second-order methods.
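In the spirit of the hands-on session (this is an illustrative sketch, not the speaker's notebook), the three first-order updates can be hand-rolled on a toy quadratic; the test function f(x) = (x - 3)^2 and all hyperparameter values below are arbitrary choices for demonstration:

```python
import math

def grad(x):                      # gradient of the toy objective f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

def sgd(x, lr=0.1, steps=100):
    for _ in range(steps):
        x -= lr * grad(x)         # plain gradient descent step
    return x

def momentum(x, lr=0.1, beta=0.9, steps=300):
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(x)    # accumulate a velocity term
        x -= lr * v
    return x

def adam(x, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=200):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g          # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g      # second-moment estimate
        mhat = m / (1 - b1 ** t)           # bias corrections
        vhat = v / (1 - b2 ** t)
        x -= lr * mhat / (math.sqrt(vhat) + eps)
    return x

print(sgd(0.0), momentum(0.0), adam(0.0))   # all approach the minimizer x = 3
```

On a real model the gradient would come from backpropagation over a minibatch; the update rules themselves are exactly these few lines.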

1/25/2024 Dr. David Smith

Title: An Introduction to Response Surface Methodology

 

Abstract:  Suppose one wishes to optimize the yield of a chemical process.  There may be many factors such as temperature, pressure, stirring rate, ingredient proportions, etc., that can influence yield.  An experimental design can help determine the major factors affecting yield.  Once these are identified, we can model yield as a function of these factors and attempt to identify the optimal settings of these factors to maximize yield.  When we have two factors driving yield, we can fit a surface as a function of these factors.  Response surface methodology is one way to achieve this goal.  We will work through this methodology with an example and highlight the software implementation using SAS.
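As a minimal illustration of the two-factor case (synthetic data invented for this sketch, not the talk's SAS example), one can fit the full second-order model by least squares and solve for the stationary point of the fitted surface:

```python
import numpy as np

# synthetic "yield" data: the true surface peaks at (x1, x2) = (1.0, 2.0)
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 3, 40)
x2 = rng.uniform(0, 4, 40)
y = 80 - 2 * (x1 - 1.0)**2 - 1.5 * (x2 - 2.0)**2 + rng.normal(0, 0.1, 40)

# design matrix for the second-order model:
# y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = beta

# stationary point: set the gradient of the fitted quadratic to zero
B = np.array([[2 * b11, b12], [b12, 2 * b22]])
x_opt = np.linalg.solve(B, -np.array([b1, b2]))
print(x_opt)   # close to the true optimum (1.0, 2.0)
```

In practice the data would come from a designed experiment (e.g. a central composite design) rather than random settings, but the fitting and stationary-point steps are the same.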

11/14/2023 Dr. Maximilian Pechmann

Title: Bose-Einstein condensation for particles with repulsive short-range pair-interactions in a Poisson random external potential in R^d

Abstract: We study Bose gases in d dimensions, d ≥ 2, with short-range repulsive pair-interactions, at positive temperature, in the canonical ensemble and in the thermodynamic limit. We assume the presence of hard Poissonian obstacles and focus on the non-percolation regime. For sufficiently strong interparticle interactions, we show that almost surely there cannot be Bose-Einstein condensation into a sufficiently localized, normalized one-particle state. The results apply to the canonical eigenstates of the underlying one-particle Hamiltonian. This is joint work with Joachim Kerner.

11/7/2023 Dr. Motoya Machida

Title: Langevin diffusions and Monte Carlo methodology.

 

Abstract: Gradient descent is an iterative optimization method for minimizing an objective function f(x). When noise is added and the step size converges to zero, the iterative algorithm can be viewed as a Langevin diffusion, which eventually reaches equilibrium in a distribution proportional to exp(-f(x)). In this talk we introduce the notion of a "stochastic flow" of sample paths and a "duality" of stochastic processes. A dual process, called an "intertwining dual," plays a pivotal role in determining a stopping time (of the algorithm) so that we can sample "exactly" from the stationary distribution in a finite number of steps, enabling us to perform Monte Carlo simulations. [Demonstration code (bm.R) for this talk and more can be found at vps63.heliohost.us/e-math/MCMC]
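A minimal sketch of the first idea, noisy gradient descent as a discretized Langevin diffusion, in Python rather than the speaker's R code. The target f(x) = x^2/2 is an illustrative choice whose equilibrium distribution exp(-f) is the standard normal; the exact-sampling machinery of the talk (intertwining duals, stopping times) is not reproduced here:

```python
import math, random

random.seed(1)

def grad_f(x):                 # f(x) = x^2 / 2, so exp(-f) is the standard normal density
    return x

def langevin(x0, step=0.01, n=200_000):
    """Unadjusted Langevin algorithm: a gradient-descent step plus Gaussian noise.
    The noise scale sqrt(2*step) discretizes dX = -f'(X) dt + sqrt(2) dW."""
    x, samples = x0, []
    for _ in range(n):
        x += -step * grad_f(x) + math.sqrt(2 * step) * random.gauss(0, 1)
        samples.append(x)
    return samples

s = langevin(0.0)
mean = sum(s) / len(s)
var = sum(v * v for v in s) / len(s) - mean ** 2
print(mean, var)   # approximately 0 and 1, matching N(0, 1)
```

With a fixed positive step size the chain only approximates the stationary distribution; the talk's point is that a well-chosen stopping time can remove that discretization bias entirely.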

 

10/31/2023 Patrick Bartol

Title: Feynman-Kac Formulation of Black-Scholes Option Pricing.

Abstract:  Financial institutions sell derivatives and options to buyers.  A typical contract that gives the buyer the right to purchase a stock for a strike price at a future time, regardless of the stock's value at that time, is called a European call option.  These institutions rely on the Black-Scholes option pricing model to set contract prices for European call options.  The primary result we investigate is what the Black-Scholes model is and how it can be derived.  We will explore its foundations in stochastic differential equations, starting with probability theory and Ito calculus.  From there, infinitesimal generators are introduced, along with related results, to deduce the Feynman-Kac formula and solve the Black-Scholes option pricing problem.  Once this has been examined, we move on to relating the Feynman-Kac formula to the partial differential equation associated with the asset portfolio; the solution of this equation via the Feynman-Kac formula is the Black-Scholes option price.
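For concreteness, the end result can be sketched numerically (an illustrative sketch, not the speaker's material): the closed-form Black-Scholes call price alongside its Feynman-Kac representation as a discounted expectation of the payoff under the risk-neutral dynamics. All parameter values are arbitrary examples:

```python
import math, random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def mc_call(S, K, T, r, sigma, n=200_000, seed=7):
    """Feynman-Kac view: price = e^{-rT} E[max(S_T - K, 0)] under the
    risk-neutral dynamics dS = r*S dt + sigma*S dW."""
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        z = random.gauss(0, 1)
        ST = S * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n

print(bs_call(100, 100, 1.0, 0.05, 0.2))   # about 10.4506
print(mc_call(100, 100, 1.0, 0.05, 0.2))   # close to the closed-form value
```

The agreement of the two numbers is exactly the content of the Feynman-Kac formula in this setting: the PDE solution equals the discounted expectation along the stochastic paths.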

 

10/24/2023 Dr. Padmini Veerapen

Title: Looking Ahead with Hope Amidst the Trials and Tribulations of Graduate School in Mathematics

Abstract: In this talk, I will address questions graduate students often pose to me when I am either in attendance at a conference or when I am teaching a class. The questions center around themes such as survival in graduate school, developing good research writing skills, being ready for life after graduate school, and establishing collaborations. At conferences or in classes, my focus is often on the delivery of the mathematics; my goal in this talk is to elaborate on some of the tricky non-mathematical issues that come along as we focus on the mathematics.

10/17/2023 Dr. Kehelwala Dewage Maduranga

Title: NCQS: Nonlinear Convex Quadrature Surrogate Hyperparameter Optimization

Abstract: Ever wanted a taste of the latest in artificial intelligence, without the hefty travel bill to France? Deep learning, a cornerstone of modern artificial intelligence, necessitates optimized hyperparameters as models evolve in complexity. Traditional optimization strategies, with their reliance on smooth loss functions, are often suboptimal for advanced models. To bridge this gap, we introduce the Nonlinear Convex Quadrature Surrogate (NCQS) method for hyperparameter optimization. NCQS employs a data-driven approach, utilizing a convex quadrature surrogate to determine optimal hyperparameters, validated across various benchmarks and datasets. Our method showcases versatility in tasks like automatic target recognition, pushing the boundaries in resource-efficient deep learning and addressing pivotal challenges like computational memory and latency.
10/3/2023 Shelly Forgey

Title: Instant Feedback in the Classroom through Learning Catalytics

Abstract: Are you using a Pearson product this semester, or do you plan on teaching with a Pearson text in the near future? Included in their platform is a versatile product called Learning Catalytics, which utilizes something almost every student has all-too-close at hand: their phone, computer, or tablet. Create your own questions that connect seamlessly to your presentation and give you an immediate snapshot of your students' understanding. With 18 different question types, including pencil-paper style graphing as well as the usual multiple-choice and numeric, keep your students engaged throughout the lecture with easy-to-prepare modules so that you can deliver questions exactly when you need feedback. You can even deliver a question on-the-fly without being derailed by the blank looks and uncomfortable silence we often experience during lectures.

(Bring laptop, tablet, or other device for participation)
9/26/2023 Isaac Gyasi

Title: Estimation Comparison of the AR(1) Model Using the Box-Jenkins Method and Multilayer Perceptron

Abstract: Forecasting time series data is an important subject in many fields today. The traditional models used in forecasting time series have gained popularity over the past decades. In this paper, 1000 AR(1) samples of size 100 were created, and the next value (prediction) for each sample was predicted using the AR(1) model and an artificial neural network (a multilayer perceptron). The density plots of the predictions of both models were plotted and their variances computed. By the uniformly minimum variance unbiased estimator (UMVUE) criterion, the AR(1) model predictions had less variability than the predictions of the multilayer perceptron. This speaks well of the conditional maximum likelihood estimator for the AR(1) model, making it more likely to produce an estimate close to the true parameter.
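The AR(1) half of the experiment can be sketched as follows (a simplified reconstruction, not the author's code): simulate many AR(1) samples and apply the conditional MLE of the autoregressive parameter, which coincides with the least-squares slope of x_t on x_{t-1}. The true parameter value 0.6 is an arbitrary choice for the sketch:

```python
import random

random.seed(42)
phi_true, n = 0.6, 100

def simulate_ar1(phi, n):
    """Generate an AR(1) path x_t = phi * x_{t-1} + e_t with N(0,1) noise."""
    x = [random.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(phi * x[-1] + random.gauss(0, 1))
    return x

def cond_mle_phi(x):
    """Conditional MLE of phi: the least-squares slope of x_t on x_{t-1}."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

estimates = [cond_mle_phi(simulate_ar1(phi_true, n)) for _ in range(1000)]
mean_est = sum(estimates) / len(estimates)
print(mean_est)   # near phi_true = 0.6, with a slight small-sample bias toward 0
```

The one-step-ahead prediction for each sample is then simply the estimated phi times the last observed value; the paper compares the spread of these predictions against those of a multilayer perceptron.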
9/19/2023 Jeremy Carew

Title: Induction of Diversity in Classifier Ensembles

Abstract: The issue of predicting the class of an object is common in the machine learning world. Today, we will briefly discuss the random forest algorithm, one of the most popular algorithms in machine learning. Afterward, we will discuss a newly proposed method, the "Krypteia" method, and compare and contrast it with the random forest.
9/5/2023 Dr. Damian Kubiak

Title: Extreme Diameter 2 Properties in Banach Spaces

Abstract: In this talk we will present extreme versions of diameter 2 properties in Banach spaces, as well as some related properties. A Banach space has a diameter 2 property if all members of a certain class of subsets of the unit ball have diameter 2; it has an extreme diameter 2 property if the diameter 2 is attained. We will present examples of spaces which do and do not possess these properties.
8/29/2023 Dr. Amy Chambers

Title: Intro to C*-Algebras with Examples
 
