Amanda Redlich spends much of her time pondering complex, abstract, mathematical questions. As assistant professor of mathematics, she teaches courses in multivariate calculus, probability, and combinatorics and graph theory.

“These ideas, which I’m so used to talking about in abstract ways, can also be seen in the real world the whole time,” she said. “Concepts of allocation algorithms and randomized decision making may sound very highfalutin, but you use them all the time in your daily life without knowing it.”

Redlich recently entertained her faculty colleagues with the lunchtime lecture “Milk and Cookies: Randomness and Decision-making.”

“I was looking for a cute title, and milk and cookies go together well, plus I’ve always liked them.”

**Why milk and cookies?**

Let’s start with milk. I use the example of buying milk in a grocery store, one where there are several checkouts to choose from. You’ve bought your milk and whatever else you wanted; you’re in a hurry, and you have to decide which line to join. Do you simply join the shortest line? Or the fastest moving? What about the line nearest to you? If there are lots of lines to choose from, should you pick half of them and choose the shortest from those lines?
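The last question in that list can be sketched in a few lines of code. This is a minimal illustration, not anything from Redlich's lecture: instead of scanning every checkout line, the shopper samples a few at random and joins the shortest of those. The line lengths here are made up.

```python
import random

def shortest_of_sample(line_lengths, sample_size, rng):
    """Pick the shortest line among a small random sample of lines.

    Scanning every line takes time; sampling a few and taking the
    best of those is the shopper's (and the algorithm designer's)
    shortcut, and it usually lands close to the true shortest line.
    """
    sampled = rng.sample(range(len(line_lengths)), sample_size)
    return min(sampled, key=lambda i: line_lengths[i])

rng = random.Random(0)
# Hypothetical store: 20 checkout lines with 0-10 people waiting in each.
lines = [rng.randint(0, 10) for _ in range(20)]

pick = shortest_of_sample(lines, sample_size=3, rng=rng)
best = min(range(len(lines)), key=lambda i: lines[i])
print(lines[pick], lines[best])
```

The sampled pick is not guaranteed to be the overall shortest line, but it is found after looking at only three lines instead of twenty.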

Anytime you’re trying to make a quick decision like this, to find the shortest option, you’re mimicking what computer scientists do when they write computer programs. They do it in a more abstract setting, but it’s exactly the same process you’re going through at the grocery store: “OK, I’ve got a bunch of options, but I want to make up my mind quickly. How do I do it without messing up?”

Consider a search engine like Google. When you type something, a computer somewhere has to generate the web page, and billions of such requests are handled every second, so the program has to decide, quickly, which machine will do which piece of the work. Algorithms of this kind spread the work out across the computers.
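The spreading-out idea can be simulated directly. The sketch below (my own toy example, not from the lecture) assigns tasks to servers two ways: each task goes to a server picked purely at random, or each task samples two random servers and goes to the less loaded one. Comparing the busiest server under each rule shows why the second, "best of two" rule balances load better.

```python
import random

def max_load(num_tasks, num_servers, choices, rng):
    """Assign tasks to servers. Each task samples `choices` random
    servers and goes to the least loaded of them. Return the load
    on the busiest server at the end."""
    loads = [0] * num_servers
    for _ in range(num_tasks):
        picks = [rng.randrange(num_servers) for _ in range(choices)]
        target = min(picks, key=lambda s: loads[s])
        loads[target] += 1
    return max(loads)

rng = random.Random(1)
one_choice = max_load(10_000, 100, choices=1, rng=rng)  # pure random assignment
two_choices = max_load(10_000, 100, choices=2, rng=rng)  # best of two random picks
print(one_choice, two_choices)
```

With 10,000 tasks on 100 servers, the average load is 100 per server; sampling two servers per task keeps the busiest server much closer to that average than pure random assignment does.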

**What about the cookies?**

I like cookies and I like baking them, so I decided to use that analogy to demonstrate the second type of problem. In this situation you don’t have a lot of information and you have to decide what to do.

Let’s say it’s your first time baking cookies and you have no idea what the recipe should be, but you do know you like the classic chocolate chip variety. You look for popular examples that other cookie bakers have used and try to figure out the most successful recipe. Effectively, you follow the crowd, so it’s the opposite of the grocery store scenario where you’re looking for the path of least resistance. You’re still maximizing the chances of getting what you want, but using the opposite approach.
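The follow-the-crowd rule can also be written down as a tiny program. The recipe names and ratings below are entirely made up for illustration: among recipes that enough other bakers have reviewed, pick the one with the best average rating.

```python
# Hypothetical ratings other bakers have left for each recipe (1-5 stars).
ratings = {
    "classic chocolate chip": [5, 4, 5, 5, 4, 5],
    "oatmeal raisin":         [3, 4, 3],
    "double chocolate":       [4, 5],
}

def crowd_favorite(ratings, min_reviews=3):
    """Follow the crowd: among recipes with enough reviews to trust,
    pick the one with the highest average rating."""
    popular = {name: rs for name, rs in ratings.items() if len(rs) >= min_reviews}
    return max(popular, key=lambda name: sum(popular[name]) / len(popular[name]))

print(crowd_favorite(ratings))  # → 'classic chocolate chip'
```

The `min_reviews` cutoff reflects the "no prior knowledge" part of the scenario: with little information of your own, you lean on options the crowd has already tested thoroughly.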

Both the grocery store and the cookie analogies are extremely simple ways of describing the principles behind some very complex mathematical formulae, and how they’re used by computer scientists.

**How difficult is it to translate these formulae into real-world applications?**

Personally, I find it difficult, because I’m a pure mathematician by training and I cannot do anything in the real world (I consistently lose at tic-tac-toe!). But as I collaborate with people in the fields of applied mathematics and computer science, I learn from them how to translate these ideas into technical applications.

The area of my work that is most closely connected to the real world is mathematical biology. This involves trying to solve medical questions, such as what causes a particular disease, when you have only partial information. This is similar to the cookie situation, when you’re trying to choose the best recipe with limited or no prior knowledge of how to bake a cookie.

My work in biology is about using randomness to get a little more information. Conduct a random experiment and that might give you enough information to be more confident about the correct medical decision, because even random actions have patterns and these patterns can supply you with valuable data.
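The idea that even random actions have patterns can be seen in a simple simulation of my own devising (not from Redlich's work): repeat a random experiment whose underlying rate is hidden, and the fraction of successes clusters tightly around that hidden rate, turning randomness into usable information.

```python
import random

def estimate_rate(true_rate, num_trials, rng):
    """Run a random experiment many times and count how often the
    event occurs; the observed fraction estimates the hidden rate."""
    hits = sum(1 for _ in range(num_trials) if rng.random() < true_rate)
    return hits / num_trials

rng = random.Random(42)
hidden = 0.3  # the pattern we don't know in advance
guess = estimate_rate(hidden, 10_000, rng)
print(round(guess, 2))
```

No single trial reveals anything, but across 10,000 random trials the estimate reliably lands within a few percent of the hidden rate, which is exactly the kind of pattern-from-randomness that supports a more confident decision.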

**Conclusions**

The common thread running through my research is the idea of randomness and patterns interacting. Sometimes a lot of different people acting randomly can generate a pattern, even though each individual person doesn’t know what pattern they’re making.

The other thing that can happen is that I’m looking for the correct answer, trying to find a pattern, and I can use random guesses to locate it. Sometimes I look at random decisions others have made and try to find a mathematical pattern, like in the milk and cookies examples, and sometimes I use randomness in my own analysis as a way to find a pattern, like in the medical example.
