Anthony Ricigliano - News and Articles by Anthony Ricigliano: We probably didn’t see it coming, but the decline of education in the U.S. has been crystallized by a book titled “Battle Hymn of the Tiger Mother” by Amy Chua. The book and its author have come under intense criticism for a tough-love-on-steroids stance on parenting. The book’s premise is that Americans coddle their children, leaving them to settle for far less than their full potential.
Combined with the December release of the latest test results from the Program for International Student Assessment (PISA), “Battle Hymn of the Tiger Mother” has put an unflattering spotlight on primary and secondary education in the U.S. as well as American parenting styles.
The results from the PISA test showed that American students had slipped badly versus those in other countries, finishing in 17th place overall. Students from the U.S. ranked 17th in reading, 23rd in science, and 31st in math, getting squashed by students from Shanghai, who won each category by a comfortable margin.
President Obama called the dismal scores a “Sputnik moment” as the realization sunk in that the U.S. had fallen behind badly in a race that it comfortably ruled for years.
The stunning results from the Shanghai students have a few simple facts at their foundation: “Chinese students work harder, with more focus, for longer hours than American students do.” According to Time magazine, “Chinese students already have a longer school year than American pupils — and U.S. kids spend more time sitting in front of the TV than in the classroom.” The education race is reminiscent of the rivalry between the U.S. and Russia, which spanned from sports to space, as well as the later rivalry with Japan, which was focused largely on the financial arena.
The U.S. is still the biggest economy in the world, but China is coming fast, having taken the second spot from Japan. With the U.S. mired in an anemic recovery from its real-estate-induced recession, China’s economy is on a tear. Not only is its economy growing rapidly, China is also the biggest of America’s creditors, with the Treasury Department estimating that our total debt to China is approximately $843 billion. That is over $10,000 in debt for the average American family, and just a fraction of our total debt of $14 trillion.
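A quick back-of-envelope check of the figures above. The household count is an assumed ~78 million U.S. family households, not a number from the article:

```python
# Sanity-check the per-family debt claim above. Inputs marked "assumed"
# are illustrative estimates, not figures from the article.
debt_to_china = 843e9      # Treasury estimate cited above
total_debt = 14e12         # total national debt cited above
us_families = 78e6         # assumed count of U.S. family households

per_family = debt_to_china / us_families
share_of_total = debt_to_china / total_debt

print(f"Debt to China per family: ${per_family:,.0f}")   # over $10,000
print(f"Share of total debt: {share_of_total:.1%}")      # a small fraction
```

With those inputs, the debt to China works out to a bit under $11,000 per family and about 6% of the total, consistent with the claims above.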
Despite the outcry from American parents, there are elements of Amy Chua’s tough-love regimen that are supported by studies in psychology and cognitive science. Her criticism that American parents over-protect their children from distress is backed by a book fittingly titled “A Nation of Wimps” by Hara Estroff Marano. Marano states, “Research demonstrates that children who are protected from grappling with difficult tasks don't develop what psychologists call 'mastery experiences.' Kids who have this well-earned sense of mastery are more optimistic and decisive; they've learned that they're capable of overcoming adversity and achieving goals. Children who have never had to test their abilities grow into 'emotionally brittle' young adults who are more vulnerable to anxiety and depression.”
Judging by the ubiquitous advertisements for depression meds, it looks like the “Tiger Mom” is on to something.
Author Anthony Ricigliano
Wednesday, February 23, 2011
Tuesday, February 22, 2011
BP Pays a Crony - News by Anthony Ricigliano
Anthony Ricigliano - Latest News: BP's much-publicized compensation fund for Gulf oil spill victims has received over 91,000 requests for final damage settlement payments from people and businesses across the Gulf but has issued only one. Give them credit, it was for a hefty sum of $10 million, but it comes with one caveat: the recipient is an existing BP business partner that was paid only after BP intervened on its behalf.
BP has not divulged the name of the recipient, citing disclosure issues. BP has admitted that it went to bat for its partner to make it the first one paid from the $20 billion compensation fund, known as the “Gulf Coast Claims Facility.” The amount paid to BP’s partner dwarfs the stopgap payments that have been parceled out while people wait for final settlements.
Another galling aspect of the $10 million payment is that final payments to the remaining 91,000 claimants won’t begin until February at the earliest, according to the administrator of the facility. The fund's administrator is Washington lawyer Kenneth Feinberg, whose firm is being paid $850,000 per month to run it. To no one’s surprise, Feinberg’s firm has also come under intense criticism from lawmakers, plaintiffs’ attorneys, and claimants, who have repeatedly complained that the facility has no transparency, is running at the behest of BP, has shortchanged claims, and is dragging its feet on payments. You can’t blame Feinberg for staying put; he is currently negotiating with BP to revamp the pay structure and extend his administration of the facility through August 2013.
While a BP spokesperson claimed that the funding facility "reviewed our positions and made an independent decision regarding the outcome of the claim," Feinberg was independent enough to say on Monday that the facility was never requested to review the claim that BP lobbied for. Feinberg stated that BP struck an outside deal with the business and told the fund to make the payment.
Feinberg told AP that, "At the request of the parties, the settlement reached between BP and the other party was paid out of the GCCF fund. It was a private settlement and we paid it, but we were not privy to the settlement negotiations between BP and that party.”
There is also an appeals process, which runs through the U.S. Coast Guard, for disgruntled claimants, but that agency isn’t doing claimants any favors either. Of the 264 appeals that have been processed, the finding in every one was that the facility acted correctly in either denying claims or paying a small fraction of what was requested.
It appears that residents and businesses that incurred damages from BP’s blowout are still being disregarded by the people holding the money. It’s still rough out there, unless of course you’re a crony of BP.
Author Anthony Ricigliano
Presenting Your Private Company to Investors
Anthony Ricigliano - Business News and Advice by Anthony Ricigliano: If the time has come to raise funding to expand your business, you’re likely to be presenting your business to a variety of investors. Assuming that you are past the “friends and family” funding stage, you could end up presenting to investors referred to you by friends and family, or to angel investing syndicates. First of all, your company either has a product or service or has something in the concept phase. Either way, there are points to be made and mistakes to avoid. One of the biggest mistakes business owners make is over-emphasizing how great an idea their product or service is. Don’t get me wrong, differentiating yourself from the competition is important. The problem is, quite frankly, that your idea is probably being pursued by other companies right now. If it’s a really great idea, there will be more people chasing it in a few weeks or months.
Here’s another crack in the “my idea is so great that we’ll take over the world” pitch: getting a patent may or may not protect you. If a patent isn’t allowed, or doesn’t protect you for some other reason, that’s one thing. If it does, you may be taking on a problem that kills your company anyway: a long, drawn-out court battle.
Don’t toss up your hands and walk away yet. There is a way to differentiate your business, impress investors, and realize your business’s potential: focus on execution. A detailed roadmap of how you’re going to outwork and out-execute your competition is what is going to matter, both to your potential investors and to your company.
It’s quite possible that the reason you started your business is that you see endless potential, with opportunities dovetailing out to other endless opportunities. You see the market as broad and deep, with revenues sitting out there for the taking. Here’s another mistake to avoid: spending more time on the huge potential of these dovetailing markets than on the opportunity that exists in the short term. It doesn’t matter if the first market opportunity is tiny compared to the downstream markets; your potential investors are going to want to hear how your company is going to grow step by step.
Next, presenting your business as having no competition may sound great, but a space with no competition really isn’t a space at all. An investor hearing that there’s no competition should immediately wonder if a market exists and, if it does, ask why no one is addressing it. Having the answer to a question that isn’t being asked is a sure way to lose an investor, and a lot of time, waiting for that market to develop, if it ever does. A great example of this type of situation is Corning’s “Gorilla Glass,” which was patented in 1962 and sat on the shelf for almost half a century before markets developed in high-tech devices and high-definition televisions. Corning could afford the wait, but that luxury isn’t available to startups. Competition in a space confirms that there is a market; now it’s up to you to out-execute the other players that are already out there.
Author Anthony Ricigliano
Friday, February 18, 2011
Anthony Ricigliano - A Movement Based on a Number - Anthony Ricigliano Blog
Anthony Ricigliano - News by Anthony J Ricigliano: The challenges facing citizens who want to take action on global warming are many. First and foremost, there’s the fossil fuel industry, the single most profitable enterprise in human history. Other challenges dovetail off the industry itself: it has the money to advance its political agenda, and oil and coal are relatively cheap sources of energy. By the way, part of the reason oil is so cheap is that Big Oil and Big Coal get to dump their byproducts into the atmosphere free of charge.
One of the other challenges is defining what exactly global warming is. The lack of a definition has made presenting a convincing global warming argument a complicated issue. Granted, we can easily see that extreme events are occurring more frequently, but defining why is a little fuzzier. Saying that greenhouse gases are to blame may be correct, but without a data point the debate can be run all over the board, as much to confuse the issue as anything else. It’s kind of like saying speeding is dangerous without actually having a speed limit on which to base the conversation.
This confusion has allowed the agendas of politicians and Big Oil to be pushed forward even as extreme weather events occur seemingly on a weekly basis. Politically, it now appears that President Obama is making any concession possible to get re-elected. This includes concessions on the regulation of carbon emissions as monitored by the EPA. The administration is backing away from proposed regulations to avoid being seen as anti-business and anti-jobs, as framed by the Republican Party.
The good news is that a data point has been determined on which we can now define what level of carbon emissions is too high, much like adding a speed limit to a conversation about the dangers of excessive speed. The data point was determined by the planet's foremost climatologist, James Hansen, who found that any carbon value higher than 350 parts per million in the atmosphere was "not compatible with the planet on which civilization developed and to which life on earth is adapted."
That number gives global warming a black-and-white reference point to start working from. The problem is that carbon levels in the atmosphere now measure 390 parts per million, about 11% higher than the level we need to maintain life as we know it. This number has fostered a movement known as 350.org, which is now mobilizing people who are interested in saving the planet from global warming.
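The “about 11%” figure follows directly from the two numbers in the paragraph above:

```python
# How far above Hansen's 350 ppm threshold are we, given the article's figures?
safe_ppm = 350      # Hansen's upper bound cited above
current_ppm = 390   # atmospheric CO2 level cited above

excess = (current_ppm - safe_ppm) / safe_ppm
print(f"CO2 exceeds the 350 ppm target by {excess:.1%}")  # about 11%
```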
This mobilization includes the coordination of almost 15,000 global warming demonstrations in 188 countries. Foreign Policy magazine called the demonstrations “the largest ever coordinated global rally" about any issue, ever. If you’re concerned about global warming, 350.org is definitely a great place to start.
Anthony Ricigliano
Wednesday, February 16, 2011
Anthony Ricigliano News - Amazing and Depressing Stats on our National Debt
Anthony Ricigliano - Anthony Ricigliano Business Advice: The national debt is the amount that the United States has borrowed and is currently paying interest on. The national debt of the U.S. is now over $14 trillion, a number that is larger than the gross domestic product of China, the United Kingdom, and Australia combined.
Here’s a list of stats that are amazing but that you may not want to read:
* In 2010, the United States accumulated over $3.5 billion in new debt each and every day. That’s more than $2 million per minute.
* The cost of executing the wars in Iraq and Afghanistan is well over $1 trillion and counting.
* The Treasury Department estimates that our debt to China is approximately $843 billion and counting.
* According to the January 2010 Congressional Budget Office (CBO) report, the federal budget deficit in 2009 was $1.4 trillion (9.9% of GDP). The 2010 deficit was approximately $1.3 trillion (9.1% of GDP). Not since 1945 have deficits been that high relative to GDP.
* According to the March 2010 CBO report, at proposed spending levels, the national debt will increase to 90% of GDP by 2020, at about $20 trillion.
* The government is also borrowing from itself, having borrowed from Social Security and Medicare, which have had surpluses.
* In 2009, according to the CBO, $187 billion of tax receipts were used to pay interest on the national debt. This is interest only and does nothing toward reducing the debt.
* The share of the national debt for each employed American is more than $90,000.
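Two of the stats above can be cross-checked with simple arithmetic. The employment figure is an assumed ~139 million employed Americans, not a number from the list:

```python
# Cross-check the "$2 million per minute" and "$90,000 per worker" claims.
new_debt_per_day = 3.5e9              # from the list above
per_minute = new_debt_per_day / (24 * 60)

total_debt = 14e12                    # total national debt cited above
employed = 139e6                      # assumed number of employed Americans
per_worker = total_debt / employed

print(f"New debt per minute: ${per_minute:,.0f}")        # over $2 million
print(f"Debt per employed American: ${per_worker:,.0f}") # more than $90,000
```

Both claims hold up: $3.5 billion per day is roughly $2.4 million per minute, and $14 trillion spread over that assumed workforce comes to about $100,000 per worker.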
Recession and extended war efforts have exacerbated the numbers attached to the national debt as tax receipts have decreased while war and entitlement spending have increased. It’s entirely possible that these expenditures could be decreased (by ending the war effort) while tax receipts increase in an improving economy.
The issue at this point is that both parties seem intent on blowing up the national debt regardless of the factors in play at the present time. Politicians seem intent on continually delivering the message to their constituents that spending can continue and that we’ll deal with the debt monster at a later date. This shifts the debt burden to future generations who will suffer as the debt piled on by earlier generations consumes the lion’s share of the country’s GDP. It seems that everyone is living for today while leaving the bill for our kids, grandchildren, and the generations that follow.
Author Anthony Ricigliano
Friday, February 11, 2011
Virtual Storage - By Anthony Ricigliano
Author Anthony Ricigliano - News and Articles by Anthony Ricigliano: While it’s true that information is king, he’s definitely a greedy ruler! As the business world continues to demand the storage of more and more data for longer periods of time, the need for disk space grows exponentially larger each year. To compound the issue, the low price of storage means that many software developers no longer feel the need to make their products space-efficient, and government regulations add new retention requirements for critical information each year. As the business units see the price tag on servers and disk space become more affordable, they can’t understand why adding just one more should be a problem. They fail to recognize that the cost of a growing computer room includes more than just the initial cost of the storage units.
The Shocking Cost of Maintaining Storage Units
Most non-IT workers would be shocked to find out that the cost of managing each storage unit can be as much as four to ten times the original purchase price. In addition to putting a big dent in the IT budget, ever-increasing storage units lead to server sprawl and constantly declining operating efficiency. Increased maintenance can also be disruptive, expensive, and burdensome to the entire enterprise. To solve this problem, system engineers have been working on file virtualization methods to eliminate these issues. Their goal is to reduce storage and server inefficiencies while permitting virtually unlimited growth. Let’s take a look at exactly how they intend to accomplish this lofty goal.
Breaking the Tight Connection between Clients, Servers, and Storage
The old strategy of tightly coupling storage space with clients and servers is a big reason that adding a new storage unit becomes expensive to maintain. When machines from a variety of vendors are added to the network, they may not all integrate seamlessly, creating individual islands of storage to manage. When applications are physically mapped to a specific server for storage, any changes, including additions, require modifications to this complex mapping scheme. In some cases, adding a new device or moving a system to a storage unit with more space requires expensive and annoying downtime. This often leads to under-utilization of the actual storage space, an expensive proposition, because system administrators over-allocate space to minimize the need to take an outage. To break free from this outdated methodology, file virtualization removes the static mapping process, allowing storage resources to move freely between applications as needed without restricting access to the data.
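The decoupling described above boils down to one extra level of indirection: clients address data by a stable logical path, and a mapping layer resolves it to whatever device currently holds the file. A minimal sketch of the idea, with all class, device, and path names hypothetical rather than any vendor's actual API:

```python
class VirtualFileLayer:
    """Toy illustration of file virtualization: clients use stable
    logical paths while the physical location can change freely."""

    def __init__(self):
        self._map = {}  # logical path -> (storage unit, physical path)

    def mount(self, logical, unit, physical):
        self._map[logical] = (unit, physical)

    def migrate(self, logical, new_unit, new_physical):
        # The data moves to a new device; clients never see the change.
        self._map[logical] = (new_unit, new_physical)

    def resolve(self, logical):
        return self._map[logical]


layer = VirtualFileLayer()
layer.mount("/reports/q4.dat", "unit-A", "/vol0/q4.dat")
layer.migrate("/reports/q4.dat", "unit-B", "/vol7/q4.dat")
print(layer.resolve("/reports/q4.dat"))  # client path is unchanged
```

Because applications only ever see the logical path, an administrator can rebalance or decommission physical units without remapping clients or taking an outage.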
Adding a Layer of Intelligent Design to the Network
File virtualization adds a layer of intelligence to the network to decouple logical data access from the physical retrieval of the actual files. This separates the application and the client from the physical storage devices so that static mapping is no longer needed. With this change, the existing bank of servers can be maintained without disrupting the core system or the user’s access to valuable information. After implementing a file virtualization strategy, many IT shops find that they can consolidate storage units and increase their overall utilization. In this way, they may be able to simplify the system configuration by decommissioning older storage devices that are no longer needed, or find that they can go much longer than anticipated without adding disk space.
In today’s IT world, most shops are finding that using a file virtualization system is not only a “best practice,” it’s a must-do to continue operating. IT shops whose budgets continued to rise each year just a short time ago are seeing their available funds shrink more and more each year. With increasing pressure to reduce costs or keep them flat, file virtualization is also a virtual requirement.
Anthony Ricigliano
Thursday, February 10, 2011
Virtualization for the Dynamic Enterprise
Anthony Ricigliano News - Business Advice by Anthony Ricigliano:
What does Server Virtualization Mean?
Server virtualization is the use of technology to separate software, including the operating system, from the hardware. This means that you can run several environments on the same physical server. In some installations, this could mean that several identical operating systems are run on the same machine. Other shops could decide to run a Windows platform, a Linux system, and a UNIX environment on a single server.
Advantages of Server Virtualization
In today’s demanding business environment, server virtualization offers many different advantages. Not only does virtualization allow servers and data to be more mobile than ever, it also provides a cost-effective way to balance flat or shrinking budgets. The following list details the major benefits:
• Consolidation – Most large servers run applications that only take up a small percentage of their processing power. Even busy software packages usually have only brief peak periods that utilize more than 50% of their CPU capacity. The rest of the time, the capacity sits unused. By virtualizing the server so that additional systems can take advantage of under-utilized resources, IT shops can increase their return-on-investment (ROI). Although some companies have reported a consolidation ratio as high as 12:1, most shops can easily show a 3:1 to 4:1 rate.
• Decreased Footprint – By decreasing the number of physical servers, the size of the computer room can be reduced and utility costs should decrease.
• Lower Hardware Costs – The utilization of a higher percentage of existing hardware resources will reduce the total number of physical servers that are needed. This will save money on the upfront expense of purchasing hardware and the long-term cost of maintenance.
• Flexibility – Server virtualization allows an IT shop to be much more flexible. Instead of waiting for new hardware to arrive before implementing a new system, a new virtual server can be created on an existing machine. This also provides a more flexible method for migration and disaster recovery.
• Easier Testing and Development – Historically, IT installations have used separate physical servers for their development, acceptance testing, and production environments. With virtualization, it is an easy process to create either different or identical operating environments on the same server. This allows developers to compare performance on several different environments without impacting the stability of the production system.
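The consolidation arithmetic behind the first bullet is easy to check with a back-of-the-envelope sketch. The numbers below are illustrative assumptions, not measurements: sum the peak CPU demand of a set of lightly loaded servers, then see how many physical hosts that combined load actually needs if each host is filled to a safe target utilization.

```python
import math

def hosts_needed(peak_loads, target_utilization=0.75):
    """Physical hosts required if each host may be loaded to target_utilization
    of one full server's worth of CPU capacity. Inputs are fractions of a CPU."""
    total_demand = sum(peak_loads)
    return max(1, math.ceil(total_demand / target_utilization))

# Twelve servers that each peak at only 20% of their CPU (illustrative figures):
peak_loads = [0.20] * 12
print(hosts_needed(peak_loads))  # 2.4 / 0.75 -> 4 hosts, i.e. a 3:1 ratio
```

Twelve under-utilized machines collapsing onto four hosts matches the 3:1 ratio cited above; workloads with lower peaks or more tolerance for contention push the ratio higher.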
Virtualization and Disaster Recovery
The growth in both international business and large-scale natural disasters has many organizations closely analyzing their disaster recovery plans and general hardware malfunction procedures. In either event, it is critical to be back up and running in a very short period of time. Most modern IT shops require consistent uptime 24 hours a day to maintain their core operations, or their business will be severely impacted. Both reliability and accessibility are greatly improved when server virtualization is used to its fullest potential.
By reducing the total number of servers needed to duplicate the production environment, it is much less expensive to create and test an off-site disaster recovery environment. Hardware, space, and backup expenses are dramatically reduced. It’s easy to see how setting up 30 or 40 pieces of hardware would be both easier and cheaper than configuring 100 items.
Along the same lines, a hardware malfunction will be less of an issue with server virtualization. While many more systems will run on the same piece of hardware, most shops find that they can easily duplicate physical servers for automatic rollover in the event of a hardware failure when they virtualize.
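The rollover idea in the paragraph above can be sketched as follows. This is a toy illustration of the concept only; the host names are invented, and real products handle detection, quorum, and restart ordering far more robustly than this.

```python
class Host:
    """A physical host running some set of virtual machines."""

    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy
        self.vms = []

def failover(primary, standby):
    """Move every VM off an unhealthy primary onto the standby host."""
    if not primary.healthy:
        standby.vms.extend(primary.vms)
        primary.vms = []
    return standby

primary = Host("esx-01")
standby = Host("esx-02")
primary.vms = ["web-vm", "db-vm", "mail-vm"]

primary.healthy = False   # simulated hardware failure
failover(primary, standby)
print(standby.vms)        # all three VMs now run on the standby host
```

Because the virtual machines are just portable images rather than installations tied to specific hardware, the "move" amounts to restarting them elsewhere, which is exactly why duplicating physical servers for rollover becomes practical.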
Major Virtualization Products
While there are always smaller players in any new technology, VMware and Microsoft are the biggest providers of server virtualization products.
• VMware offers the free VMware Server package or the more robust VMware ESX and ESXi products. Systems that are virtualized by VMware products are extremely portable and can be installed on virtually any new piece of hardware with a low incidence of complications. The system can be suspended on one machine, moved to another one, and immediately resume operations at the suspension point when restarted.
• Microsoft Virtual Server is a virtualization product that works best with the Windows operating systems, but can also run other systems like the popular Linux OS.
Anthony Ricigliano