Hello 0.4 GHz clock-rate increase~. Also some associated topics, and general stuff.

May 23, 2008 03:01

Before: [image]

After: [image]

The numbers don't add up in that, but that's because AMD's 'lower speed when idling' thingie was active for the first (the stock speed is twice the frequency shown) and apparently not working in the second - I've since updated the driver (odd to think of processors having drivers), and now the Opteron slows down when not busy. It seems to like dropping its multiplier to 5 for a total of about 1GHz, as opposed to the 2.4GHz it has when operating at full power. Though, potentially, that's not its true limit; I hear some of these can be overclocked very well (3GHz sometimes, and not even using any special cooling), but I tend not to mess around with that; the only reason the numbers in those pictures aren't exactly their stock values is that Umbra-tan's motherboard seems to like pushing them slightly unless I specifically tell it not to.
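(For the curious, the arithmetic there is just reference clock × multiplier. A quick sketch in Python - the 200 MHz reference clock is an assumption based on the stock value for these chips, since, as noted, the motherboard nudges the real numbers a little higher:)

    # Effective core clock = reference clock x multiplier.
    # 200 MHz is assumed; the actual value depends on the motherboard.
    REFERENCE_MHZ = 200

    def core_clock_ghz(multiplier, reference_mhz=REFERENCE_MHZ):
        return reference_mhz * multiplier / 1000.0

    print(core_clock_ghz(5))   # idle: 1.0 GHz
    print(core_clock_ghz(12))  # full power: 2.4 GHz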

In terms of benefit, it does seem to have made starting Firefox a bit faster; due to my habit of session-saving ~200 tabs, that tends to take a little while, so a change in power is noticeable. Thing is, though, now I'm kinda wondering if it's worth swapping out my graphics card (a Radeon 9800 Pro, which was top of the line... in about 2004 or so) for a Radeon 3850. It'd certainly improve 3D performance*, but unless I do a lot of gaming or Blender-ing, it may not be worth the ~£150** or so to do it. Other factors are that it'd hardware-accelerate H.264 video and such (a minor point, as that's not an issue with Umbra-tan at the moment), and that, since it'd be an AGP card***, I'd not get any reuse out of it with a new computer. Though with that said, Umbra's been my partner long enough that I'd not really want to cannibalise her for parts anyway...

That's another thing: my next computer would be built from scratch again, which would be quite a significant investment, particularly if I get to implement all the things I'd want to (a liquid-cooling system is perhaps the most notable, though for that I'm considering making a custom case too). This means I have no idea when I'd be able to do it; student loan money wouldn't easily cover it unless I was very careful, and I don't really want to ask my parents for a significant amount, so I may need to wait until I have some form of job.

I could look into work over the summer, but... this is potentially the last one where I don't need to, since my fourth year of Uni would also be my final one. It might be selfish (do I have ojousama tendencies or something?), but I've tried to avoid work for as long as possible, on the grounds that, after I'm out of education, I'll need to spend the rest of my life working.

...Which brings up the subject of jobs. I still don't know what area I want to go into, though I'm starting to think programming might not be it. It'd depend on the field, I guess, but writing programs is turning out to be something I can do, but not really a strength. By which I mean, I'm competent, but nowhere near the best, and as such there must be some other area I could go into that'd really exploit what I'm good at.
I might be being a little harsh on myself there, though. Maybe it's just me failing to really get into programming the way others can, or maybe it's the fact that I don't specialise in being a CompSci person. Or perhaps I'm comparing myself too much against the highly skilled (two people in my group project team were really good, though I might've been better than the other two) and forgetting how I compare to average programmers. Or it might just be that I'm used to dealing with people less skilled than those who come to the CompSci course at Bristol, which is, after all, rather highly rated.
Overall, though, I've started to get the feeling that this isn't an area I excel at, which has left me uncertain. I'm kinda used to being good at the stuff I choose to do... ^^;

That's probably enough for the moment. I'll add this, though: I did okay in the databases exam, since much of the material was fairly non-technical stuff ("write down the one-line definition of a database" was on there), and one bit was covered in WebTech lectures. I'd've done better if I'd actually gone over SQL syntax before the exam, but... well, the result shouldn't be too bad, considering I tried to teach myself most of the course in part of one night. =P

I also dropped AI (technically "AI and logic programming") at the last moment, since I was doing one more unit than I needed to. To be honest, I maybe should have done so a lot earlier; it was clear quite a while ago that it wasn't what I was after, considering it was about logic and Prolog more than it was about, say, decision-making algorithms in games.

This leaves cryptography as the other unit I need to do well in; to be honest, taking that was probably also a mistake, since it's quite mathematical and far more time-consuming than it has any right to be (people have proposed doubling its worth as a unit, claiming it has enough material to justify that), and I'd probably have done better if I'd taken something else. What's done is done, though...

* The 9800 Pro was good in its day, but there have been many better cards since then. The 3850, on the other hand, is the second best ATI currently has to offer, and should, at worst, make the AGP interface the bottleneck.
** This is factoring in an aftermarket cooler, probably a Zalman VF-900. If nothing else, the artwork on graphics card coolers (and packaging) kinda tends to rub me the wrong way. =P
*** For those unfamiliar with graphics hardware: firstly, you're still reading? Unexpected! Secondly, AGP is a physical interface that's mostly obsolete these days, having been replaced by PCI Express, which offers a higher rate of data transfer, or something.
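(To put rough numbers on that - these are peak figures from memory, so treat them as ballpark: AGP 8x tops out around 2.1 GB/s shared between directions, while a first-generation PCI Express x16 slot manages about 4 GB/s in each direction.)

    # Ballpark peak-bandwidth comparison (figures from memory, not measured):
    agp_8x_gbs = 66e6 * 4 * 8 / 1e9    # 66 MHz base, 32-bit bus, 8x transfers ~= 2.1 GB/s
    pcie_x16_gbs = 0.25 * 16           # PCIe 1.x: 250 MB/s per lane per direction, 16 lanes

    print("AGP 8x:   ~%.1f GB/s (shared)" % agp_8x_gbs)
    print("PCIe x16: ~%.1f GB/s per direction" % pcie_x16_gbs)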