The demand for data centers is on the rise. As online commerce continues to grow, businesses are trying to meet the demand it places on their IT infrastructures. Implementing and maintaining in-house data solutions is an expensive and time-consuming proposition. Many companies are turning to data centers to help alleviate those costs so they can focus on their core operations.
A data center is a secure facility where companies can store valuable IT equipment. Data centers are equipped with a power infrastructure that copes with outages, and they enforce security protocols that protect the integrity of both data and equipment. In addition, data centers can offer customers dedicated office space: after a disaster, for example, a company can relocate its employees to a data center and continue running daily operations in a comfortable environment.
In today’s world, green is becoming the colour of choice for a number of business sectors – and the IT industry is no exception.
The advent of government-driven green initiatives, such as the Carbon Reduction Commitment Energy Efficiency Scheme (CRCEES), means that thousands of UK businesses are now monitoring their energy usage in a way never previously seen.
As buyers continue to focus on environmentally friendly products and services, going green doesn’t just provide a competitive advantage, it’s a necessity.
It’s no surprise, then, to learn that many IT departments are looking at ways not only to enjoy the benefits of lower energy bills, but also to be somewhat kinder to the environment. Having a green data centre is a great way of reducing those all-important carbon emissions, so what can be done to make a data centre environmentally friendly?
In this post I will outline the process of setting up an OpenVPN installation, configured to use a PKI to authenticate clients, and the routing setup needed for the clients to use the VPN as their internet gateway.
To follow this tutorial you will need some basic knowledge of Linux and its tools, and a basic understanding of how computer networks function. For this exercise I will be using my laptop as the server…
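The setup described above can be sketched roughly as follows. This is only an outline, not the full tutorial: the easy-rsa v3 command names, the 10.8.0.0/24 VPN subnet, and the eth0/tun0 interface names are assumptions and will vary with your distribution and configuration.

```shell
# 1. Build a minimal PKI with easy-rsa (v3 syntax assumed)
easyrsa init-pki
easyrsa build-ca nopass                  # create the certificate authority
easyrsa build-server-full server nopass  # server certificate and key
easyrsa build-client-full client1 nopass # one client certificate and key
easyrsa gen-dh                           # Diffie-Hellman parameters

# 2. Make the server push itself as the clients' default gateway
#    (directives for /etc/openvpn/server.conf):
#      push "redirect-gateway def1 bypass-dhcp"
#      push "dhcp-option DNS 8.8.8.8"

# 3. Enable IP forwarding and NAT so VPN traffic reaches the internet
#    (eth0 is the assumed outbound interface)
sysctl -w net.ipv4.ip_forward=1
iptables -t nat -A POSTROUTING -s 10.8.0.0/24 -o eth0 -j MASQUERADE
```

The `redirect-gateway def1` option replaces the client's default route without deleting the original one, so the tunnel can be torn down cleanly.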
The World Wide Web was originally designed to deliver static documents (.html). It later became routine in almost all businesses, as well as in ordinary life, as a platform serving sophisticated interactive applications, displacing desktop binary applications that depend on operating systems, libraries, frameworks, etc.
Let’s go back a few decades, to the 1960s, the dawn of modern computing as we know it today. Computer programs were stored and executed on a central computer, known as a mainframe, which users accessed remotely through computer terminals. The main advantages of client-server architecture are:
Recently, I worked on a web-based application that has to display and print invoices in HTML format. The invoices have to be printed on sheets with a pre-printed logo and company details.
Nowadays, more and more sites are dynamic web applications, driven by WordPress, Joomla, Drupal, or other free, open source CMSs, as well as custom “closed source” CMSs. All of them are more or less vulnerable. According to Sam Ransbotham, in “An Empirical Analysis of Exploitation Attempts based on Vulnerabilities in Open Source Software”, open source applications are more vulnerable than closed source ones.