# Sitemap

A list of all the posts and pages found on the site. For the robots out there, an XML version is available for digesting as well.

## Pages

## Posts

## On Monads, Monoids and Endofunctors 1: The monoid

## Stumbling backwards into np.random.seed through jax.

Alternative title: PRNG for you and me through `(j)np.random.seed`. This post aims to (briefly) discuss why I like `jax` and then compare `jax` and `numpy` vis-à-vis randomness.
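A minimal sketch of the contrast the post explores (my own illustration, not code from the post): `numpy` seeds hidden global state, while `jax` threads explicit keys.

```python
import numpy as np
import jax

# NumPy: one global, stateful seed -- every draw mutates hidden state.
np.random.seed(0)
a = np.random.uniform()
b = np.random.uniform()  # differs from `a`: the global state advanced

# JAX: explicit, functional keys -- the same key always gives the same draw.
key = jax.random.PRNGKey(0)
x = jax.random.uniform(key)
y = jax.random.uniform(key)     # identical to `x`: same key, same draw
key, subkey = jax.random.split(key)
z = jax.random.uniform(subkey)  # a fresh, independent draw from the new key
```

Splitting keys is how `jax` gets fresh randomness without any hidden state.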

## PyTorch Gradient Manipulation 1

**Preface**: this notebook is part 1 in a series of tutorials discussing gradients (manipulation, stopping, etc.) in `PyTorch`. The series covers the following network architectures:
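As a taste of what gradient manipulation looks like, here is a minimal sketch (my own illustration, not code from the tutorial) of stopping gradient flow with `Tensor.detach()`:

```python
import torch

# `detach()` cuts a tensor out of the autograd graph,
# so no gradient flows through it.
x = torch.tensor(2.0, requires_grad=True)

y = x * x            # gradient flows through both factors: dy/dx = 2x = 4
y.backward()
print(x.grad)        # tensor(4.)

x.grad = None        # clear the accumulated gradient
z = x * x.detach()   # second factor treated as a constant: dz/dx = 2
z.backward()
print(x.grad)        # tensor(2.)
```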

## A Machine Learning oriented introduction to PALISADE, CKKS and pTensor.

Note: “we” means “I”

## MAPE Madness

**Problem setup**: You want to use the Mean Absolute Percentage Error (**MAPE**) as your loss function for training **Linear Regression** on some forecast data. MAPE has found success in forecasting because it has desirable properties:
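For reference, a minimal `numpy` sketch of MAPE (my own illustration, not code from the post):

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent.

    Note: undefined when any y_true is zero -- one of MAPE's known pitfalls.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0

print(mape([100, 200, 400], [110, 180, 400]))  # mean of 10%, 10%, 0%
```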

## Fundamentals Part 1: An intuitive introduction to Calculus and Linear Algebra

As you’ve probably heard, calculus is imperative for Machine Learning. However, there is a definite emphasis on differentiation compared to integration, so this series of posts will build from simple derivatives to Jacobians and Hessians. Ideally, at the end of this series, if you read a paper that mentions one of the topics above, you’ll have a rough idea of why the authors chose to do what they did and what their choice means for the results.

## Fundamentals Part 2: Hessians and Jacobians

This section builds on the last post, Fundamentals Part 1: An intuitive introduction to Calculus and Linear Algebra; if you’re not familiar with calculus or linear algebra, I highly recommend starting there. If this is your first time seeing all of this, know that this section is more involved than the first fundamentals post. Be prepared to feel a little lost, but if you keep at it, I know you’ll get there (it took me a while to wrap my head around it).
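As a small illustration of where the series is headed (my own sketch, not code from the post), `jax` can compute the gradient and Hessian of a scalar function directly:

```python
import jax
import jax.numpy as jnp

# f : R^2 -> R, f(x) = x0^2 * x1 + x1^3
def f(x):
    return x[0] ** 2 * x[1] + x[1] ** 3

x = jnp.array([1.0, 2.0])

# Gradient: [2*x0*x1, x0^2 + 3*x1^2] = [4, 13] at x = (1, 2)
grad_f = jax.grad(f)(x)

# Hessian: [[2*x1, 2*x0], [2*x0, 6*x1]] = [[4, 2], [2, 12]] at x = (1, 2)
hess_f = jax.hessian(f)(x)
```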

## Pusheen The Limit

Note: The code can be found here: quitPusheenMeAround

## Ian Quah - Initial post

Any good story starts with a **why** so we begin there. I’m not particularly skilled at learning new concepts; I often get lost and have to go back to read even simple things to refresh my memory. So I’m hoping that this blog will serve as:

## portfolio

## Portfolio item number 1

Short description of portfolio item number 1

## Portfolio item number 2

Short description of portfolio item number 2

## publications

## APPLE: Automatic patch pattern labeling for explanation

Published in *AAAI 2018 New Orleans*, 2018

**Summary**: We proposed a new method for explainability in Convolutional Neural Network architectures.

Download here

## Self-Driving Database Management Systems

Published in *Conference on Innovative Data Systems Research*, 2020

**Summary**: We explored constructing a self-driving database management system: Peloton DB.

Download here

## OpenFHE: Open-source fully homomorphic encryption library

Published in *WAHC 2022: Workshop on Encrypted Computing & Applied Homomorphic Cryptography*, 2022

**Summary**: I was involved in the engineering and design of the library.

Download here

## reviews

## Fall 2023 Quarter Class Review

My first quarter at UW! **Summary**: adapting to life in academia from the tech industry.

## Winter 2024 Quarter Class Review

My second quarter at UW! **Summary**: grant writing takes longer than you think.

## talks

## Thoughts on the Brain and Machine Learning: Biological Plausibility, SNNs, and beyond

I covered a bit of Neuroscience, discussed the plausibility of backpropagation in the brain, highlighted spiking neural networks and analyzed the Neural Gradient Representation by Activity Differences (NGRAD) algorithm.

## Homomorphic Encryption for Encrypted Machine Learning

I led the hands-on discussion and code walkthrough, giving attendees an introduction to FHE-backed Machine Learning. I started with a naive first-pass implementation, then walked them through how they might optimize their own code, arriving at an optimized logistic regression implementation.

## An Informal Introduction to Monoids, Functors and Monads in the Context of Machine Learning

I gave an experimental talk where I chatted about how concepts in category theory, primarily monoids, functors, and monads, lead to better, cleaner code.

## Encrypted Machine Learning

I led the hands-on discussion and code walkthrough, giving attendees an introduction to FHE-backed Machine Learning. I started with a naive first-pass implementation, then walked them through how they might optimize their own code, arriving at an optimized logistic regression implementation.

## teaching

## Software Carpentry

Workshop, *Software Carpentry 2024, eScience Institute*, 2024

Assistant for Software Carpentry in Python