A Google employee has broken the world record for calculating pi just in time for the mind-bogglingly long number's special day.
Emma Haruka Iwao spent four months working on the project in which she calculated pi to 31.4 trillion digits.
Pi holds a special place in the realm of math. It's an irrational number whose digits continue forever without repeating. You calculate it by dividing a circle's circumference by its diameter.
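That circumference-to-diameter ratio can be estimated directly. The sketch below is only an illustration (record-scale computations work very differently): it uses Archimedes' classic approach of inscribing a regular polygon in a circle and repeatedly doubling its sides, so the polygon's perimeter closes in on the circle's circumference.

```python
import math

# Archimedes-style estimate of pi = circumference / diameter.
# Start from a regular hexagon inscribed in a unit circle (side length 1)
# and repeatedly double the number of sides; the polygon's perimeter
# approaches the circle's circumference, 2 * pi.
def archimedes_pi(doublings):
    sides, side = 6, 1.0
    for _ in range(doublings):
        # side length of the polygon with twice as many sides
        side = math.sqrt(2 - math.sqrt(4 - side * side))
        sides *= 2
    return sides * side / 2  # perimeter divided by the diameter (2)

print(archimedes_pi(15))
```

With 15 doublings (a 196,608-sided polygon) the estimate already agrees with pi to well beyond six decimal places; pushing much further in floating point runs into round-off error, which is one reason serious computations use other methods.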
Iwao did her number crunching primarily from Google's office in Osaka, Japan, where she works as a developer advocate for Google Cloud. Fittingly, she used 25 Google Cloud virtual machines to generate the enormously long number. It's the first pi record calculated on the cloud.
Her milestone was certified by Guinness World Records on Wednesday, making her the third woman to set a world record for calculating the number. Iwao broke the record for pi set by Peter Trueb in 2016, which was 22.4 trillion digits long.
Google made the announcement on March 14 (3.14), which is known as Pi Day. The semi-official holiday for the unique number is celebrated by eating actual pies.
"It was my childhood dream, a longtime dream, to break the world record for pi," Iwao told CNN Business. She has been working toward this moment since she was 12, when she first downloaded software to calculate pi on her personal computer.
Iwao said she had help with the final calculation from Alexander Yee, who invented a program called "y-cruncher" for computing pi and other constants. Daisuke Takahashi, her former professor and a one-time world record holder for pi, offered advice and technical strategies.
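Recent pi records computed with y-cruncher rely on the Chudnovsky series, which yields roughly 14 new digits per term. A minimal (and vastly slower) sketch of that series using Python's arbitrary-precision `decimal` module:

```python
from decimal import Decimal, getcontext

# Illustrative sketch of the Chudnovsky series for pi.
# Real record attempts use binary splitting and FFT-based
# multiplication; this simple loop only shows the math.
def chudnovsky_pi(digits):
    getcontext().prec = digits + 10  # extra guard digits
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(L)
    for i in range(1, digits // 14 + 2):  # ~14 digits per term
        M = M * (K**3 - 16 * K) // i**3   # exact integer update
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return C / S

print(chudnovsky_pi(30))
```

Each loop iteration costs more as the integers grow, which is why production code replaces this linear summation with binary splitting and distributes the work across machines, as Iwao's cloud setup did.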
Beyond being a convenient way to promote Google's own cloud products, Iwao's new record shows how far cloud computing technology has come.
All 31,415,926,535,897 digits of her pi calculation can be downloaded by anyone who wants to experiment with the data. In the past, if you wanted to share the longest known version of pi, you had to put it on a hard drive and mail it.
"We keep investing in the cloud and it gets even better over time," said Iwao. "Hopefully we can do an even bigger computation in the future."