Portal:Storage


The Wikibon Data Storage Portal contains data storage industry research, articles, expert opinion, and data storage company profiles.


Wikitip

Data Analysis

How do you run capacity planning for big data? Capacity planning should involve more than applying a utilization percentage and experience.

It should be a mathematical calculation of every byte of the data sources coming into the system. How about designing a predictive model that projects data growth out to 10 years? How about involving the business to confirm the data growth drivers and the feasibility of future data sources? Why not factor compression and purging into the calculation, to reclaim space for data growth? And why consider only disk utilization, with no consideration of other hardware resources such as memory, processor, and cache? After all, it is a data processing environment. This list of considerations can still grow. Explore the complete write-up at: http://datumengineering.wordpress.com/2013/02/15/how-do-you-run-capacity-planning/
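
To make that concrete, here is a minimal sketch in Python of the kind of projection the tip argues for. All names and rates (base_tb, annual_growth, compression_ratio, purge_fraction, and the example figures) are hypothetical illustrations, not numbers from the original write-up:

    # Hypothetical capacity projection combining growth, compression, and purging.
    # All rates below are illustrative assumptions, not measured values.

    def project_capacity(base_tb: float,
                         annual_growth: float,
                         compression_ratio: float,
                         purge_fraction: float,
                         years: int) -> list[float]:
        """Project on-disk terabytes for each year.

        base_tb           -- raw data volume today
        annual_growth     -- fractional growth per year (0.40 = 40%)
        compression_ratio -- logical-to-physical ratio (2.0 = 2:1)
        purge_fraction    -- share of data purged each year
        """
        logical = base_tb
        on_disk = []
        for _ in range(years):
            logical *= (1 + annual_growth)    # new data arrives
            logical *= (1 - purge_fraction)   # aged data is purged
            on_disk.append(logical / compression_ratio)
        return on_disk

    # Example: 100 TB today, 40% yearly growth, 2:1 compression, 5% purged per year.
    for year, tb in enumerate(project_capacity(100, 0.40, 2.0, 0.05, 10), start=1):
        print(f"year {year:2d}: {tb:8.1f} TB on disk")

Swapping the assumed rates for measured ones, and adding memory, processor, and cache dimensions, is where the real modeling work begins.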


Featured Case Study

Virtualization Energizes Cal State University

John Charles is the CIO of California State University, East Bay (CSUEB), and Rich Avila is Director of Server & Network Operations. In late 2007 they were both looking down the barrel of a gun. The total power being used in the data center was 67 kVA. The maximum power from the current plant was 75 kVA. PG&E had informed them that no more power could be delivered. They would be out of power in less than six months. A new data center was planned but would not be available for two years.

read more...
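
The power math behind that deadline is worth spelling out. Here is a small sketch using only the figures quoted above; the monthly growth rate is inferred from the six-month warning, not stated in the case study:

    # Back out the power headroom described in the case study.
    used_kva = 67.0   # data center draw in late 2007
    max_kva = 75.0    # maximum the utility (PG&E) could deliver

    headroom_kva = max_kva - used_kva    # 8 kVA of slack
    utilization = used_kva / max_kva     # ~89% of capacity

    # "Out of power in less than six months" implies load growth of at least:
    implied_growth = headroom_kva / 6    # ~1.33 kVA per month (inferred)

    print(f"utilization: {utilization:.0%}, headroom: {headroom_kva:.0f} kVA")
    print(f"implied load growth: >= {implied_growth:.2f} kVA/month")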


Featured How-To Note

Storage Virtualization Design and Deployment

A main impediment to storage virtualization is the lack of multiple storage vendor (heterogeneous) support within available virtualization technologies. This inhibits deployment across a data center. The only practical approach is either to implement a single-vendor solution across the whole of the data center (practical only for small and some medium-size data centers) or to implement virtualization in one or more of the largest storage pools within a data center.

read more...
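
As a hypothetical illustration of the second option, the sketch below picks the largest pools until a target share of total capacity sits behind the virtualization layer. The pool names, sizes, and the 60% threshold are invented for the example, not taken from the note:

    # Hypothetical helper: choose the largest storage pools to virtualize
    # until a target share of total data-center capacity is covered.

    def pools_to_virtualize(pools: dict[str, float], target_share: float) -> list[str]:
        """Return pool names, largest first, covering target_share of capacity."""
        total = sum(pools.values())
        chosen, covered = [], 0.0
        for name, tb in sorted(pools.items(), key=lambda kv: kv[1], reverse=True):
            if covered / total >= target_share:
                break
            chosen.append(name)
            covered += tb
        return chosen

    # Invented example: virtualize the largest pools covering 60% of capacity.
    pools = {"tier1-san": 400.0, "tier2-san": 250.0, "nas-filers": 120.0, "dev-test": 60.0}
    print(pools_to_virtualize(pools, 0.60))   # ['tier1-san', 'tier2-san']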
