CentOS Post Mortem & Analysis

Background I manage the crunchtools lab and the infrastructure for this blog much like a development data center. I have a rigorous weekly checklist, which includes optionally applying operating system patches as they become available. I do not perform the updates every week because of time constraints, but when I do, I patch all of
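A typical patch run on one of the CentOS machines looks something like the sketch below; the repository refresh and the reboot check are illustrative assumptions, not the checklist itself.

```bash
# Refresh repository metadata and apply every available update (yum-based CentOS)
yum clean expire-cache
yum -y update

# Reboot only if the running kernel is no longer the newest installed kernel
latest_kernel=$(rpm -q --last kernel | head -1 | awk '{print $1}')
if [ "$latest_kernel" != "kernel-$(uname -r)" ]; then
    reboot
fi
```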

Red Hat Enterprise Virtualization (RHEV) & Identity, Policy, Audit (IPA)

Background In my ever-evolving lab, it came time to integrate Red Hat Enterprise Virtualization (RHEV) with Identity, Policy, Audit (IPA). There were a few caveats and searching Google didn’t help, so hopefully this article can save you some time. Integrating the two was fairly straightforward. The biggest challenge was finding a quick and easy way

Devops with Bash

This presentation walks through some of the software development processes used to develop Shiva, a mass ssh client. The development of Shiva was used as a teaching aid for a class taught at Ohio Linux Fest 2012. Shiva was used to demonstrate concepts such as good command line interface design, testing, configuration design, and advanced features of Bash, such as background functions. This presentation and the associated class were developed to help systems administrators think more like developers.
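As a rough illustration of the background-function technique mentioned above (a minimal sketch with made-up hostnames, not Shiva’s actual implementation):

```bash
#!/bin/bash
# Run a function against each host as a background job, then wait for all of them.

run_remote() {
    local host=$1
    ssh -o BatchMode=yes "$host" 'uptime'
}

for host in web01 web02 db01; do    # hypothetical hosts
    run_remote "$host" &            # launch each function call in the background
done

wait                                # block until every background job finishes
echo "All hosts completed."
```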

RHEL6 and Cisco WRVS4400N Networking

Background This weekend I decided to upgrade my home network with a Cisco WRVS4400N wireless router. Like a typical router, it provides standard wireless services (WPA2, DHCP, etc.), but this model also supports four distinct VLANs and four distinct SSIDs. This has allowed me to create separate networks for work, play,
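On the RHEL6 side, a tagged VLAN interface is just another ifcfg file. The sketch below assumes eth0 and VLAN ID 10 with placeholder addressing; adjust for your own network.

```bash
# Create an 802.1q tagged interface eth0.10 on RHEL6 (addresses are placeholders)
cat > /etc/sysconfig/network-scripts/ifcfg-eth0.10 <<'EOF'
DEVICE=eth0.10
VLAN=yes
BOOTPROTO=static
IPADDR=192.168.10.2
NETMASK=255.255.255.0
ONBOOT=yes
EOF

ifup eth0.10
```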

OpenSSL Certificate Authority

Background Recently, I discovered how to use the openssl-provided CA script to create a certificate authority and self-signed certificates. Traditionally, I had run all of the commands manually. When using the CA script, it is critical to understand the underlying security concepts. Certificate Authority Openssl has infrastructure to create a long-lived Certificate
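The basic workflow with the wrapper script looks roughly like this; the path shown is the RHEL/CentOS location and may differ on other distributions (some ship it as CA.pl).

```bash
# Build a new certificate authority (creates the demoCA directory by default)
/etc/pki/tls/misc/CA -newca

# Generate a private key and certificate signing request for a host
/etc/pki/tls/misc/CA -newreq

# Sign the request with the CA that was just created
/etc/pki/tls/misc/CA -sign
```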

Designing a Robust Monitoring System

Reading Ted Dziuba’s Monitoring Theory article, I was reminded of several conventions that I have developed over the years to help with monitoring servers, network devices, software services, batch processes, etc. First, break down your data points into levels so that you can decide how to route them. Second, avoid interrupt-driven technology like email; it lowers your productivity and prohibits good analysis techniques.
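As a toy illustration of routing by level instead of emailing everything (the levels and syslog destinations here are made up for the example):

```bash
#!/bin/bash
# Route a monitoring data point by its level rather than interrupting a human.

route_event() {
    local level=$1 message=$2
    case $level in
        critical) logger -p local0.crit    "$message" ;;  # feeds the on-call/paging system
        warning)  logger -p local0.warning "$message" ;;  # rolled up into a daily report
        info)     logger -p local0.info    "$message" ;;  # kept only for trend analysis
    esac
}

route_event warning "disk usage on /var above 80%"
```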

The Logs Are an Approximation of Reality

The logs are an approximation of reality and they cannot be taken as canonical or gospel. This is true in several senses. Logs can give insight into the standard investigative questions of who, what, when, where, and why, but they almost always require other information to truly answer all of these questions.

Today, Postfix reiterated this lesson for me. I had a problem where our gateway mail server couldn’t deliver mail to a peer. The receiving mail server kept bouncing the email address with a 550 even though the mailbox being delivered to was real and active. Gmail, Yahoo, and MSN would all accept email from our gateway, but this one provider would not. Of course, it wasn’t a simple problem. We had a web server running Apache/PHP delivering to the local Sendmail server, which forwarded to a Postfix gateway server, which then tried to deliver to an Exim server that received mail for the destination email address.

I am not going to dig into all of the details, but of course, the first thing I did was go to the logs. The problem is, the logs were wrong! In the following examples, the users and domains in the logs have been anonymized, but the logs are real.

Decade of Storage: Analysis of Data Costs

Yesterday, I noticed this interesting tidbit from Rackspace calculating the cost of data over the last Decade of Storage. Of course, there are a few bumps in the road that made me chuckle. Interestingly, over the last couple of years it plots the cost falling from $0.40/GB to $0.06/GB. This ties together a whole bunch of things that I have thought about over the last couple of years. First, now is a wonderful time to be a user buying storage for personal audio and video. Second, regular people are going to have to start to learn data management strategies. Finally, this cost isn’t even close to what it is for me in my data center. It is easy for us to celebrate the cheap cost of raw storage while losing track of the total cost of ownership for data. I will elaborate.