Big Data is one of the hot topics of recent years. I’m not planning to explain what “Big Data” is; the photo below explains its current status well enough 🙂
Regardless of what we understand from “Big Data”, the important point is:
There is an increasing need to store ever-larger data and to use and analyze it “fast” enough.
Companies have already started to suffer from long query execution times on traditional relational databases holding only a few terabytes of data (far smaller than any “big data” definition). Instead of targeting “real-time” results, many companies have watched execution times grow from 2–3 hours to 8–10 hours, and most avoid promising instant results to customers.
Is it sustainable? Is it good for competitiveness?
It depends on your needs. If longer execution times decrease your productivity, customer satisfaction, or company profit, then you need to invest in a proper database system.
Fortunately, the current processing power of server systems is already good enough to generate instant results and even perform complex calculations on the fly. But it is not only a matter of infrastructure and hardware architecture; it also depends heavily on software architecture, database design, efficient algorithms, and logical design.
Big organizations have already adopted Big Data
I see Google as one of the most successful organizations at implementing big data infrastructure. Imagine the size of the data behind its search engine. After you press the “search” button, have you ever seen a wait icon? The results come immediately. The same goes for Google Analytics, which is even capable of showing real-time analytics.
I’m not expecting every company to invest in such a robust infrastructure, which may not be cost-effective for all. Instead, much like cloud services, I believe big data storage & analytics providers will become popular and in high demand.
While outsourced vendors can provide the technical infrastructure, I advise companies to invest in internal human resources for analyzing their big data.