Tim Poor, Director of Field Application and Sales Engineering, discusses the growth in data volumes and how Servers Direct can accommodate it.
If you think about the data sets Servers Direct worked with 10 or 20 years ago and the amount of data being created then, compared with the data sets of today, there is no doubt that data is doubling every 15 weeks. In the last two years alone, the amount of data in the world has nearly doubled, a milestone that previously took 2,000 years to reach.
The amount of data being generated today is increasing extremely fast, and the horsepower needed to process it has to grow in step. Ten years ago, a server with 250GB of storage was considered a big server.
Today, a server with 250TB is considered big, so a thousandfold increase over 10 years is a pretty remarkable growth rate, and the trend is continuing. The cost of storage keeps decreasing, and system performance follows Moore's Law (named for Intel co-founder Gordon Moore): processors become roughly twice as fast and half as expensive every 18 months. This has been the case since the 1960s, and the trend continues to push forward. Both performance and capacity are increasing.
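To see how the doubling figures above compound, here is a rough back-of-the-envelope sketch (illustrative arithmetic only, not a benchmark or anything from Servers Direct):

```python
# Compound growth under a fixed doubling period, per the
# "twice as fast every 18 months" figure quoted above.
def growth_factor(years, doubling_period_years=1.5):
    """Total growth multiple over `years`, given one doubling
    per `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Performance over 10 years at one doubling every 18 months:
print(round(growth_factor(10)))  # about 102x

# The capacity jump from 250GB to 250TB is a 1000x increase,
# which is roughly ten doublings (2**10 = 1024), i.e. storage
# capacity doubling about once a year over that decade.
print(round(growth_factor(10, doubling_period_years=1.0)))  # 1024x
```

The comparison shows why the capacity curve in the passage (1000x in a decade) is even steeper than the classic 18-month performance doubling.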
As these data sets continue to grow, Servers Direct is looking at new software that can accommodate them and handle the number crunching and extraction of data. On the file system side, with Lustre and ZFS, Servers Direct is looking at unique file systems architected specifically for very large data sets.
We also use different software, like Hadoop, to extract data, setting up clusters that pull information out of these very large data sets. And this is happening everywhere, whether in commercial, government, educational, or research environments: Servers Direct is seeing this growth in data and in the horsepower needed to manage it.
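The extraction pattern Hadoop popularized, map each record to key/value pairs, then reduce the values per key, can be sketched in a few lines of plain Python (a toy in-process illustration; a real Hadoop job distributes these phases across a cluster):

```python
# Toy sketch of the MapReduce pattern: a word count over a few
# records, run in a single process purely for illustration.
from collections import defaultdict

def map_phase(records):
    # Map: emit a (word, 1) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield word, 1

def reduce_phase(pairs):
    # Reduce: sum the values for each key.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

records = ["big data", "big servers", "data growth"]
print(reduce_phase(map_phase(records)))
# {'big': 2, 'data': 2, 'servers': 1, 'growth': 1}
```

In a cluster, the map and reduce functions stay this simple; the framework handles splitting the input, shuffling pairs by key, and running the phases in parallel across machines.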