Anne made me use that.
Anyway, I’d continued to feel underutilized–I was spending an inordinate amount of time reading email, or watching logs be parsed into MySQL (which is about as interesting as watching paint dry, and somewhat less interesting than watching mold grow).
This was depressing and frustrating. I’m here, and although I’m being paid, I’m away from home and putting off projects for other customers, and I’m not being paid enough to endure that _and_ be idle.
So I made the most of an opportunity and had The Talk with my boss. I really didn’t candy-coat anything–I told him about the issues I saw with the organization, and how its insularity and lack of documentation make it hard for new people to come in and be productive, etc.
He took it quite well–in fact, it mostly seemed to reinforce things he already knew or suspected. We talked about where to go next.
And then we had our weekly noon-on-Tuesday meeting, and I decided to play Kerry rather than Bush–I volunteered.
I came out of the meeting with several tasks of varying complexity, and got one of them (building a custom omnibus report for RT) mostly finished within an hour or two.
And then, later, I ended up taking on An Important Project–basically, fixing something an outside contractor had been working on that wasn’t operating as required. I think Doug (my boss) was impressed that I guessed why the script for parsing some logs was running a day behind–and, as was later confirmed in discussions with the programmer, nailed it. It’s vitally important that this gets done in a timely fashion, and if I get it done–which I will–I suspect I’ll get to be Hero for a Day.
I _am_ having to combat, as part of this, the fact that they’re scared of their database.
As background, I’m used to pushing databases really far–AnteSpam’s front-end servers run a continuous load of more than one query *per second* (perhaps as many as five), and hold hundreds of thousands or even millions of records, while the back-end database that collects all the statistical information holds many millions of records without breaking a sweat (though its query load is much lighter).
At the i-squad, we do some insane stuff with the database–each page view, and many of the team-specific graphic views, requires at least one database request, and some of the more complex functionality requires several more, sometimes very expensive ones. Now, some of that load does need to be relieved–almost certainly through clever use of memcached–but we’re not running a really insane server.
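To make the memcached idea concrete, here’s a minimal cache-aside sketch. The server addresses, table, and column names are all made up–the point is the pattern: check the cache, fall back to the database on a miss, and cache the result for next time.

```python
# Minimal cache-aside sketch, assuming python-memcached and MySQLdb.
# All names here (servers, database, table, columns) are hypothetical.
import memcache
import MySQLdb

mc = memcache.Client(["127.0.0.1:11211"])
db = MySQLdb.connect(host="localhost", user="app", passwd="secret", db="squad")

def team_stats(team_id):
    key = "team_stats:%d" % team_id
    stats = mc.get(key)           # try the cache first
    if stats is None:             # miss: go to the database
        cur = db.cursor()
        cur.execute("SELECT wins, losses FROM team_stats WHERE team_id = %s",
                    (team_id,))
        stats = cur.fetchone()
        mc.set(key, stats, time=300)  # cache it for five minutes
    return stats
```

Even a short expiry like that turns a query per page view into a query every few minutes for the hot pages.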
Here, though, when I proposed stuffing this logging information into the database–about 130K records per day–everyone’s knee-jerk reaction was that this was an excessive load.
Of course, I then pointed out that I was already loading twice that many records a day as part of dealing with the web logs.
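And do the math: 130,000 records spread over the 86,400 seconds in a day averages out to roughly 1.5 inserts per second–and batch loading compresses even that into a few brief bursts. Any reasonably healthy MySQL instance won’t so much as notice.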
I can’t imagine not trusting your database with that sort of volume. The real problem, though, is the consequences: because of this fear, instead of putting everything in one big database where they could easily access, correlate, and report on it, they have Berkeley DB files scattered across multiple disparate servers, or lists kept in text files, or any number of other un-integrated ways of looking at their data.
The costs are enormous, albeit hidden–I suspect they could make a lot of additional connections if they didn’t have to work so hard to get the data where they need it, in a form they can use.
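And the loading I’m proposing is nothing exotic. Here’s a minimal sketch of the idea–the log format, table, and credentials are all hypothetical, but this is the whole trick: parse, batch, insert.

```python
# Minimal log-loading sketch, assuming MySQLdb.
# The tab-separated log format and the log_entries table are hypothetical.
import MySQLdb

db = MySQLdb.connect(host="localhost", user="logs", passwd="secret", db="logdb")

def load_log(path):
    rows = []
    for line in open(path):
        # hypothetical format: timestamp <tab> host <tab> message
        ts, host, msg = line.rstrip("\n").split("\t", 2)
        rows.append((ts, host, msg))
    cur = db.cursor()
    # executemany batches the inserts; at ~130K rows/day this is painless
    cur.executemany(
        "INSERT INTO log_entries (ts, host, message) VALUES (%s, %s, %s)",
        rows)
    db.commit()
```

Once the data is all in one place, the correlating and reporting everyone actually wants becomes a SELECT instead of a scavenger hunt.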