Watch Hypernet Labs CTO Todd Chapman and data scientist Matthew Gasperetti run the same R script on a brand-new 8-core i9 with 32 GB of RAM, and then on a Galileo-enhanced Raspberry Pi. Here are the runtime results (spoiler alert):
- 1 minute 16 seconds on the i9
- 30 seconds on the Raspberry Pi
How? And why? In a world of supercomputers and vast data centers, why would we experiment with a credit card-sized, single-board computer developed to teach basic computer science in the first place? The point is simple: anyone can now access state-of-the-art computing capabilities without literally having state-of-the-art computing hardware at their fingertips.
All you need to do is drag and drop your code. Galileo does the rest.
As you can see in the clip, Matt carries out the speed test using R to run a linear regression, set to run 250,000 iterations, on both machines. The Raspberry Pi emerges victorious because it is supercharged by Galileo software. Galileo streamlines Matt’s computational pipeline, providing him with immediate, seamless access to a 96-core Google Cloud instance. The i9 doesn’t stand a chance.
Five to ten years ago, most of us stored our files on laptops and desktops. Companies like Dropbox, Box, and others have since made it possible to migrate most of this data to the cloud. An analogous shift is now taking place in the realm of data analytics and scientific computing, and Galileo is leading the movement. In another five years, we will all be running our computationally intensive jobs in the cloud instead of on our local machines.
But this can only happen if data scientists are empowered to access the cloud without pursuing second careers in IT infrastructure. Note the one thing you didn't see in the video: a lengthy and complicated cloud setup procedure. That's because Galileo eliminates it entirely.
We’re entering a universe of ubiquitous compute and invisible computing infrastructure. Join us.