Bringing Social To The Kernel

Imagine a world where you can log in to your computer once and have full access to all of its functionality, plus seamless access to all of the web sites you visit on a daily basis. No more logging in to each site individually; your computer’s operating system takes care of that for you.

That world may be coming quicker than you realize. I was listening to a recent episode of the PaulDotCom security podcast today. In this episode, they interviewed Jason Fossen, a SANS Security Faculty Fellow and instructor for SEC 505: Securing Windows. During the conversation, Jason mentioned some of the changes coming to the next version of Microsoft’s flagship operating system, Windows 8. What he described was, in a word, horrifying…

Not much information is out there about these changes yet, but it’s possible to piece together some of it. Jason mentioned that Windows 8 will have a broker system for passwords: Windows will store all of the passwords necessary to access the various services you interact with. Think something along the lines of 1Password or LastPass. The main difference is that this happens in the background, with minimal interaction from the user. In other words, you never have to explicitly log in to anything beyond your local Windows workstation.

Initially, Microsoft won’t have support for all of the various login systems out there. They seem to be focusing on their own service, Windows Live, and possibly Facebook. But the API is open, allowing third parties to provide the necessary hooks into their own systems.

I’ve spent some time searching for more information and what I’m finding seems to indicate that what Jason was talking about is, in fact, the plan moving forward. TechRadar has a story about the Windows 8 Credential Vault, where website passwords are stored. The credential vault appears to be a direct competitor to 1Password and LastPass. As with other technologies that Microsoft has integrated in the past, this may be the death knell for password managers.

ReadWriteWeb has a story about the Windows Azure Access Control Service that is being used for Windows 8. Interestingly, this article seems to indicate that passwords won’t be stored on the Windows 8 system itself, but in a centralized “cloud” system. A system called the Access Control Service, or ACS, will store all of the actual login information, and the Windows 8 Password Broker will obtain tokens that are used for logins. This allows users to access their data from different systems, including tablets and phones, and retain full access to all of their login information.

Microsoft is positioning Azure ACS as a complete claims-based identity system. In short, this allows ACS to become a one-stop shop for single sign-on: I log in to Windows and immediately have access to all of my accounts across the Internet.
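To make the mechanics concrete, here’s a stripped-down sketch of how claims-based single sign-on works in general: authenticate once to the identity provider, get back a signed token full of claims, and hand that token to each service instead of a password. Everything here (the names, the shared-secret HMAC signing) is my own illustration, not Microsoft’s actual ACS protocol:

```python
import hashlib
import hmac
import json
import time

# Hypothetical signing key; real federation uses public-key signatures so
# relying parties can verify tokens without being able to mint them.
IDP_SECRET = b"known-only-to-the-identity-provider"

def issue_token(user: str, claims: dict) -> str:
    """Identity provider: authenticate the user once, return a signed token."""
    body = json.dumps({"user": user, "claims": claims,
                       "expires": time.time() + 3600})
    sig = hmac.new(IDP_SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def verify_token(token: str) -> dict | None:
    """Relying service: accept the token if the signature and expiry hold."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(IDP_SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # forged or tampered token
    payload = json.loads(body)
    return payload if payload["expires"] > time.time() else None

# One login yields a token every federated service will accept -- which is
# exactly why compromising that one login is so valuable to an attacker.
token = issue_token("alice", {"email": "alice@example.com"})
print(verify_token(token))
```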

Sounds great, right? In one respect, it is. But if you think about it, you’re making things REALLY easy for attackers. Now they can, with a single login and password, access every system you have access to. It doesn’t matter that you’ve used different usernames and passwords for your bank accounts. It doesn’t matter that you’ve used longer, more secure passwords for those sensitive sites. Once an attacker gains a foothold on your machine, it’s game over.

Jason also mentioned another chilling detail: you’ll be able to log in to your local system using your Windows Live ID. So, apparently, if you forget the password for your local user, you can just log in with your Windows Live ID. It’s all tied together. According to the TechRadar story, “if you forget your Windows password you can reset it from another PC using your Windows Live ID, so you don’t need to make a password restore USB stick any more.” They go on to say the following:

You’ll also have to prove your identity before you can ‘trust’ the PC you sync them to, by giving Windows Live a second email address or a mobile number it can text a security code to, so anyone who gets your Live ID password doesn’t get all your other passwords too – Windows 8 will make you set that up the first time you use your Live ID on a PC.

You can always sign in to your Windows account, even if you can’t get online – or if there’s a problem with your Live ID – because Windows 8 remembers the last password you signed in with successfully (again, that’s encrypted in the Password Vault).

With this additional tidbit of information, it would appear that an especially crafty attacker could go as far as compromising your entire system without ever touching your local machine. It may not be easy, but it looks like it’ll be significantly easier than it was before.

Federated identity is an interesting concept, and it definitely has its place. But I don’t think tying everything together in this manner is a good move for security. Sure, you can already use your Facebook ID (or Twitter, Google, OpenID, etc.) as a single login for many disparate sites. In fact, these companies are betting on you doing so, since it ties all of your activity back to one central place where the data can be mined for useful and lucrative bits. Perhaps in the realm of a social network, that’s what you want, but I think there’s a limit to how wide a net you want to cast. If what Jason says is true, Microsoft may be building the equivalent of the One Ring. ACS will store them all, ACS will verify them all, ACS will authenticate them all, and to the ether supply them.

Playbox Three-Sixt-Wii!

I was fortunate enough to obtain both a PlayStation 3 and an Xbox 360 recently. I’ve owned a Wii since it launched back in 2006. The Wii was always dismissed as a non-contender in the “next-gen” console wars, but it has definitely held its own, effectively carving out its own niche. Instead of concentrating on graphics and processor technology, Nintendo went in a completely different direction, creating a new way to play games with their innovative controller.

I have been a PlayStation guy for a while. While I never owned the original PlayStation, I did get the PlayStation 2 on the day it was released. The PS2 was the clear winner of the previous generation of consoles, handily beating the GameCube and Dreamcast. Microsoft was late to the game with the original Xbox, which didn’t seem to do very well. The PlayStation 3 has been a powerhouse since it was released. It clearly has better graphics than any other system out there, and its processing power is incredible. Games on this thing look amazing. Despite all this, I don’t think Sony is doing very well.

The Xbox 360 is a pretty decent machine, despite the red-ring issue it initially had. It doesn’t have the power or graphical prowess of the PS3, but it does have pretty strong backing. I’m not a hardcore Microsoft hater, but I’m not exactly a fan either; I’ve essentially moved on from Windows and use either Linux or OS X now. Despite this, I’ve been drawn to the Xbox 360 for some time. I had avoided purchasing one, but then again, I had avoided purchasing a PS3 as well. Since getting both, however, I’ve noticed that I’m drawn more towards the 360, and I’ve grown curious about the reasoning behind this. I think I’ve finally identified it.

If you want your platform to do well, you need to build a community around it. Microsoft’s Xbox team has done this in spades. Marketing is one thing, and there is a massive marketing force behind the 360, but community can really make or break things. The PS3 has a bit of a community, mostly centered around the PS blog. Nintendo’s community is virtually nonexistent. But the 360 community is huge and engaging. Major Nelson and his team do an incredible job promoting the 360 while keeping their content entertaining and diverse. The 360 itself encourages a ton of community building with a steady stream of new content: games, videos, and music.

I think Microsoft’s Xbox team has clearly won this round of the console wars. With the advent of the Kinect and the Move, round two is already on its way. The Kinect seems to be out to an early lead, however, with the Move being mostly dismissed as a copy of the Wii motion controllers. Nintendo doesn’t seem to have a play in this latest round, though one could argue they were first to market when they initially launched.

I enjoy playing all three consoles, but the Xbox clearly seems to be winning in my home. Microsoft has done an incredible job thus far with the Xbox, and I’m hoping they continue the way they’re going.

Digital Armageddon

April 1, 2009. The major media outlets are all over this one. Digital Armageddon. The end of computing as we know it. Again. But is it? Should we all just “Chill Out?”

So what happens on April 1, 2009? Well, Conficker activates. Sort of. It activates the latest revision of its auto-update algorithm, switching the number of domains it can find updates on from 250 per day to 50,000 per day. Conficker, in its current form, isn’t really malicious beyond its techniques for preventing detection. In order to become malicious, it needs to download an update to its base code.

There are two methods by which Conficker will update its base code. The first is to download the code via a connection to one of the 50,000 domains it generates. It does not scan all 50,000 domains at once, however. Instead, it creates a random list of 500 of the 50,000 generated domains and scans them for an update. If no update is found, Conficker sleeps for 24 hours and starts over, generating a new list of 50,000 domains, randomly picking 500, and contacting them for an update. The overall result is that it becomes nearly impossible to block all of the generated domains, increasing the likelihood that an update will get through. On the flip side, this process would appear to result in a very slow spread of updates: it can easily take days, weeks, or months for a single machine to finally stumble upon a live domain.
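To make that concrete, here’s a minimal Python sketch of how a date-seeded domain-generation scheme like this works. The hashing, TLD pool, and update URL are all hypothetical stand-ins; Conficker’s actual generator is different, but the blocking problem it creates is the same:

```python
import hashlib
import random
import urllib.request
from datetime import date

TLDS = [".com", ".net", ".org", ".info", ".biz"]  # hypothetical TLD pool

def generate_domains(day: date, count: int = 50000) -> list[str]:
    """Deterministically derive `count` domains from the current date.

    Every infected host computes the same list, so the botmaster only
    needs to register one of them to push an update."""
    domains = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.md5(seed).hexdigest()
        domains.append(digest[:10] + TLDS[i % len(TLDS)])
    return domains

def check_for_update(day: date) -> bytes | None:
    """Pick 500 of the day's 50,000 domains at random and poll each one."""
    candidates = random.sample(generate_domains(day), 500)
    for domain in candidates:
        try:
            url = f"http://{domain}/update.bin"  # hypothetical update path
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()  # would be signature-checked before running
        except OSError:
            continue  # domain unregistered or unreachable; try the next one
    return None  # no luck today; sleep 24 hours and repeat tomorrow
```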

The second method is to download the code via a peer-to-peer connection between infected hosts. As I understand it, the peer-to-peer mechanism has been active since revision C of Conficker appeared in the wild. This mechanism allows an update to spread from system to system very rapidly. Additionally, based on how the peer-to-peer mechanism works, blocking it appears to be difficult, at best.
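The reason the peer-to-peer path spreads so quickly is that it behaves like a gossip protocol: every host that already has the update offers it to a few others each round, so coverage grows exponentially. A toy simulation of that dynamic (none of this reflects Conficker’s actual wire format):

```python
import random

def gossip_round(peers: dict, fanout: int = 3) -> None:
    """One round of push-based gossip: every peer that already has the
    update offers it to a few randomly chosen neighbours."""
    infected = [p for p, has_update in peers.items() if has_update]
    for peer in infected:
        for neighbour in random.sample(list(peers), min(fanout, len(peers))):
            peers[neighbour] = True  # neighbour accepts the (signed) update

# Start with one seeded peer; the update reaches everyone in a handful
# of rounds -- roughly logarithmic in the number of peers.
peers = {f"host{i}": False for i in range(1000)}
peers["host0"] = True
rounds = 0
while not all(peers.values()):
    gossip_round(peers)
    rounds += 1
print(f"update fully propagated after {rounds} rounds")
```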

So what is the risk here? Seriously, is my computer destined to become a molten heap of slag, a spam factory, or possibly a zombie soldier in a botnet attack against foreign governments? Is all hope lost? Oh my, are we all going to die?!

For the love of all things digital, pull it together! It’s not as bad as it looks! First of all, if you consistently update your machines and keep your anti-virus up to date, your chances of being infected are very low. If you don’t keep up to date, then perhaps you should start. At any rate, fire up a web browser and search for a Conficker scanner; most of the major anti-virus vendors have one. Make sure you’re familiar with the company you’re downloading the scanner from, though, as a large number of scam sites have popped up since Conficker hit the mainstream media.

If you’re a network admin, you have a bigger job. First, make sure any Windows machines you are responsible for are patched. Yes, that includes those machines on that private network that is oh-so impossible to get to; Conficker can spread via SMB shares and USB keys as well. Next, try scanning your network for infections. There are a number of Conficker scanners out there now, thanks to the Honeynet Project and Dan Kaminsky. I have personally used both the proof-of-concept Python scanner and the latest version of nmap.

If you’re using nmap, the following command line works quite well and is incredibly fast:

nmap -sC --script=smb-check-vulns --script-args=safe=1 -p139,445 \
-d -PN -n -T4 --min-hostgroup 256 --min-parallelism 64 \
-oA conficker_scan [target networks]
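Briefly: --script=smb-check-vulns runs nmap’s SMB vulnerability checks (with safe=1 to keep the checks non-disruptive), -p139,445 restricts the scan to the SMB ports, -PN and -n skip host discovery and DNS lookups, the -T4, --min-hostgroup, and --min-parallelism options speed things up considerably, and -oA writes the results in all three of nmap’s output formats.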

Finally, as a network admin, you should probably have some sort of Intrusion Detection System (IDS) in place. Snort is an open source IDS that works quite well and has a large community following. IDS signatures exist to detect all known variants of Conficker.

So calm down, take a deep breath, and don’t worry. I find it extremely unlikely that April 1 will result in anything more than a blip in network activity. Instead, concentrate on detection and patching. Conficker isn’t Skynet… yet.

Bad crawler, no cookie!

My wife is a professional SEO consultant with her own business. I work with her on occasion, helping out with the server end of things. It’s fun and challenging, and I think we work pretty well together.

So, the other day she comes to me with an odd question. Why is Google Analytics suddenly showing a high bounce rate for new keywords? Interesting problem, of course. One of the first things that popped into my mind was either a blackhat SEO or a rival of some sort. It sounds paranoid, but it does happen.

So I pulled the access logs and started poring through them. Since the bounce rate came from a keyword search, it was easy enough to locate the offending entries. There were hundreds of log entries, all coming from the same 65.55.0.0/16 address space. A couple more seconds of digging showed that 65.55.0.0/16 is owned by Microsoft. Reverse DNS on some of the IPs revealed that they were all part of the MSN web crawler. MSN apparently doesn’t provide reverse DNS for all of their IPs, but no matter, there were enough to prove that this was MSN. Here’s an example from the log:

65.55.110.195 – – [24/Mar/2009:03:08:05 -0400] “GET /index.html HTTP/1.0” 200 58838 “http://search.live.com/results.aspx?q=keyword” “Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; SV1; .NET CLR 1.1.4322)”

So what in the world is going on here? Why are we getting pounded by hundreds upon hundreds of requests from the MSN crawler? And why is the MSN crawler reporting itself as Internet Explorer 6.0? The referrer URL showed the source of the request to be from a live.com search, but these being crawler addresses, I’m willing to bet this was programmed in rather than a result of an actual search. It doesn’t really matter, though, because whatever it is, it’s causing a high bounce rate and really screwing up the site statistics. The high bounce rate may be affecting the Google ranking as well.

Before we blocked these requests, though, we wanted to make sure this was unwanted behavior, so we started digging for info. One of the pages we came across described the same behavior we were seeing. As it turns out, this strange activity is intended. Live.com claims they do this to detect cloaking. Of course, it was quite easy to identify these IPs as coming from Microsoft, and determine (rather quickly) that they are sourcing from a search engine. It would be very simple to broaden any cloaking to include those IPs, making this crazy technique useless.
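Incidentally, the accepted way to verify a crawler’s identity is a reverse-then-forward DNS check: look up the PTR record for the client IP, make sure the hostname falls under the search engine’s domain, then resolve that hostname forward and confirm it maps back to the same IP. A quick Python sketch (the trusted suffixes are just the ones we’d expect for MSN’s crawler, and as noted above, MSN doesn’t publish PTR records for every IP):

```python
import socket

# Domain suffixes we'd accept as belonging to the MSN crawler (assumption).
TRUSTED_SUFFIXES = ("search.msn.com", "live.com")

def is_legit_crawler(ip: str) -> bool:
    """Reverse-then-forward DNS check for a claimed search engine crawler."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)   # reverse (PTR) lookup
    except socket.herror:
        return False  # no PTR record published for this IP
    if not hostname.endswith(TRUSTED_SUFFIXES):
        return False  # PTR points somewhere other than MSN
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward lookup
    except socket.gaierror:
        return False
    return ip in forward_ips  # hostname must resolve back to the same IP

print(is_legit_crawler("65.55.110.195"))
```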

Microsoft claims they are continuing to tune their crawler to reduce the spam and make the keywords more relevant. The point, though, is that this seems to hurt more than it helps. As a result, many webmasters are blocking the referrer spam, even at the risk of having MSN blacklist their sites. We have followed suit, deeming both MSN and Live.com to be irrelevant search engines.

Of course, if someone out there has a better idea of how to handle this, I’m listening…

Windows 7… Take Two… Or Maybe Three?

Well, it looks like the early information on Windows 7 might be wrong.  According to an interview with Steven Sinofsky, Senior Vice President of Windows and Windows Live Engineering at Microsoft, a few details you may have heard may not be entirely true.  But then again, Mr. Sinofsky did tap-dance around a lot of the questions asked.

First and foremost, the new kernel.  There has been a lot of buzz about the new MinWin kernel, which many believe to be integral to the next release of Windows.  However, according to the interview, that may not be entirely true.  When asked about the MinWin kernel, Mr Sinofsky replied that they are building Windows 7 on top of the Windows Server 2008 and Windows Vista foundation.  There will be no new driver compatibility issues with the new release.  When asked specifically about the minimum kernel, he dodged the question, trying to focus on how Microsoft communicates, rather than new features of Windows.

So does this mean the MinWin kernel has been cut?  Not necessarily, but I do think it means we won’t see the MinWin kernel in the form it has been talked about: very lightweight and very efficient.  In order to provide 100% backwards compatibility with Vista, they likely had to add a lot more to the kernel, moving it from the lightweight back into the heavyweight category.  This blog post by Chris Flores, a director at Microsoft, seems to confirm this as well.

The release date also appears to be back to the originally stated 2010 target.  At a meeting before the Inter-American Development Bank, Bill Gates had said that a new release of Windows would be ready sometime in the next year or so, but Mr. Sinofsky stated firmly that Windows 7 would be released three years after Vista, which puts it in 2010.

Yesterday evening, at the All Things Digital conference, a few more details leaked out.  It was stated again that Windows 7 would be released in late 2009.  Interestingly enough, it seems that Windows 7 has “inherited” a few features from its chief competitor, Mac OS X.  According to the All Things Digital site, there’s a Mac OS X-style dock, though I have not been able to find a screenshot showing it.  There are these “leaked” screenshots, though their authenticity (and possibly the information provided with them) is questionable at best.

The biggest feature change, at this point, appears to be the addition of multi-touch to the operating system.  According to Julie Larson-Green, Corporate Vice President of Windows Experience Program Management, multi-touch has been built throughout the OS.  So far it seems to support the basic feature-set that any iPhone or iPod Touch supports.  Touch is the future, according to Bill Gates.  He went on to say:

“We’re at an interesting junction.  In the next few years, the roles of speech, gesture, vision, ink, all of those will become huge. For the person at home and the person at work, that interaction will change dramatically.”

All in all, it looks like Windows 7 will just be more of the same.  With all of the problems they’ve encountered with Vista, I’ll be surprised if Windows 7 becomes the big seller they’re hoping for.  To be honest, I think they would have been better off redesigning everything from scratch with Vista, rather than trying to shovel new features into an already bloated kernel.

Useful Windows Utilities? Really?

Every once in a while, I get an error saying I can’t disconnect my USB drive because a file handle is open in another program.  Unfortunately, Windows doesn’t help much beyond that; it’s left up to the user to figure out which app is responsible and shut it down.  In some cases, the problem persists even after shutting down all of the open apps, and you have to resort to looking through the process list in Task Manager.  Of course, you can always log off or restart the computer, but there has to be an easier way.

In Linux, there’s a nifty little utility called lsof.  The name is short for “list open files,” and that’s exactly what it does: it displays a current list of open files, including details such as the name of the program using each file, its process ID, the user running the process, and more.  The output can be a bit daunting for an inexperienced user, but it’s a very useful tool.  Combined with the power of grep, a user can quickly identify what files a process has open, or what process has a particular file open.  Very handy for dealing with misbehaving programs.
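If you’d rather script that check than eyeball lsof output, the same information is exposed by the third-party psutil library (an assumption on my part that it fits your setup; the /media/usb path below is just an example mount point). A small sketch:

```python
# Requires the third-party psutil package (pip install psutil).
import psutil

def who_has_open(path_fragment: str) -> None:
    """Print processes holding open any file whose path contains the fragment."""
    for proc in psutil.process_iter(["pid", "name", "username"]):
        try:
            for f in proc.open_files():
                if path_fragment in f.path:
                    print(proc.info["pid"], proc.info["name"],
                          proc.info["username"], f.path)
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            continue  # skip processes we can't inspect or that just exited

who_has_open("/media/usb")
```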

Similar tools exist for Windows, but most of them are commercial tools, not available for free use.  There are free utilities out there, but I hadn’t found any that gave me the power I wanted.  That is, until today.

I stumbled across a nifty tool called Process Explorer.  Funnily enough, it’s actually a Microsoft tool, though they acquired it when they purchased SysInternals.  Regardless, it’s a very powerful utility, and it came in quite handy for solving this particular problem.

In short, I had opened a link in Firefox by clicking on it in Thunderbird.  After closing Thunderbird, I tried to un-mount my USB drive, where I have Portable Thunderbird installed, but I received an error that a file was still open.  Apparently Firefox was the culprit, and closing it released the handle.

The SysInternals page on Microsoft’s TechNet site lists a whole host of utilities for debugging and monitoring Windows systems.  These can be fairly dangerous in the hands of the inexperienced, but for those of us who know what we’re doing, these tools can be invaluable.  I’m quite happy I stumbled across them.  The closed nature of Windows can be extremely frustrating when I can’t figure out what’s going on.  I’m definitely still a Linux user at heart, but these tools make using Windows a tad more bearable.

Ooh… Bad day to be an IIS server…

Web-based exploits are pretty common nowadays; it’s almost daily that we hear of sites being compromised one way or another.  Today, it’s IIS servers.  IIS is basically a web server platform developed by Microsoft.  It runs on Windows-based servers and generally serves ASP (Active Server Pages) content, dynamic pages similar to those generated with PHP or Ruby.  There is some speculation that this attack is related to a recent security advisory from Microsoft, but this has not been confirmed.

Several popular blogs, including one at the Washington Post, have posted information describing the situation.  There is a bit of confusion, however, as to what exactly the attack is.  It appears that the IIS servers were infected using the aforementioned vulnerability, while other web servers are being infected using SQL injection attacks.  So it looks like there are several attack vectors being used to spread this particular beauty.

Many of the reports are using Google searches to estimate the number of infected systems.  Estimates put that figure at about 500,000, but take it with a grain of salt.  While a lot of systems are clearly affected, using Google for this particular metric is somewhat flawed: Google reports the total number of links found referring to a particular search string, so there may be duplicated information.  It’s safe to say, however, that this is pretty widespread.

Regardless of the method of attack, and which server is infected, an unsuspecting visitor to an exploited site is exposed to a plethora of attacks.  The malware uses a number of exploits in popular software packages, such as AIM, RealPlayer, and iTunes, to gain access to the visitor’s computer.  Once the visitor is infected, the malware watches for username and password information, reporting it back to a central server.  Both the ISC and ShadowServer have excellent write-ups on both the server exploit and the end-user exploit.

Be careful out there, kids…

Microsoft wants to infect your computer?!?

There’s an article over at New Scientist about a “new” technique Microsoft is looking at for delivering patches.  Researchers are looking into distributing patches through a network the same way a worm spreads.  These ‘friendly’ worms would use advanced strategies to identify and ‘infect’ computers on a network, and then install the appropriate patches on each system.

On one hand, this looks like it may be a good idea.  In theory, it reduces load on update servers, and it may help to patch computers that would otherwise go un-patched.  Microsoft claims that this technique would spread patches faster and reduce overall network load.

Back in 2003, the now infamous Blaster worm was released.  Blaster took advantage of a buffer overflow in Microsoft’s implementation of RPC.  Once infected, the computer was set to perform a SYN flood attack against Microsoft’s update site, windowsupdate.com.

Shortly after the release of Blaster, a different sort of worm was released: Welchia.  Like Blaster, Welchia took advantage of the RPC bug.  Unlike Blaster, however, Welchia attempted to patch the host computer with a series of Microsoft patches.  It would also attempt to remove the Blaster worm, if present.  Finally, the worm removed itself after 120 days, or on January 1, 2004, whichever came first.

Unfortunately, the overall effect of Welchia was negative.  It created a large amount of network traffic by spreading to other machines, and downloading the patches from Microsoft.

The Welchia worm is a good example of what can happen, even when the creator has good intentions.  So, will Microsoft’s attempts be more successful?  Can Microsoft build a bullet-proof worm-like mechanism for spreading patches?  And what about the legality aspect?

In order to spread patches this way, there needs to be some entry point into the remote computer system.  This means a server of some sort must be running on the remote computer.  Is this something we want every Windows machine on the planet running?  A single exploit puts us back into the same boat we’ve been in for a long time.  And Microsoft doesn’t have the best security track record.

Assuming for a moment, however, that Microsoft can develop some sort of secure server, how are the patches delivered?  Presumably a patch-worm is released, likely from Microsoft’s own servers, and spreads to other machines on the Internet.  But many users have firewalls or NAT devices between themselves and the Internet, and unless those devices are specifically configured to allow the traffic, the patch-worm will be stopped in its tracks.  Corporate firewalls would block it as well.  And what about the bandwidth required to download these patches, especially when we’re talking about big ones like service packs?

If the patch-worm somehow makes it to a remote computer, what validation is done to ensure its authenticity?  Certificates are useful, but they have been subverted in the past.  If someone with malicious intent can hijack a valid session, there’s no telling what kind of damage can be done.
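At a bare minimum, you’d expect the receiving machine to verify a digital signature over the patch before executing a byte of it. Here’s a hypothetical sketch using the third-party Python cryptography package; Microsoft’s real mechanism would presumably build on Authenticode code signing, which is considerably more involved:

```python
# A minimal sketch of the check a patch-worm client would need, using the
# third-party `cryptography` package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def verify_patch(patch: bytes, signature: bytes, pem_public_key: bytes) -> bool:
    """Return True only if `signature` over `patch` checks out against the
    vendor's published RSA public key."""
    public_key = serialization.load_pem_public_key(pem_public_key)
    try:
        public_key.verify(signature, patch,
                          padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False  # tampered or not from the vendor; refuse to install
```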

How will the user be notified about the patch?  Are we talking about auto-install?  Will warning boxes pop up?  What happens when the system needs to be rebooted?

And finally, what about the legal aspects of this?  Releasing worms on the Internet is illegal, and punishable with jail time.  But if that worm is “helpful”, then do the same rules apply?  Network traffic still increases, computer resources are used, and interruptions in service may occur as a result.

All I can say is this: This is *my* computer, keep your grubby mitts off it.

Vista… Take Two.

With Windows Vista shipping, Microsoft has turned its attention to the next version of Windows, currently known as Windows 7.  There isn’t a lot of information about this latest iteration yet, but from what is available, it seems that Microsoft *might* be taking a slightly different direction with this version.

Most of the current talk about the next version of Windows has centered around a smaller, more compact kernel known as MinWin.  The kernel is the lifeblood of any operating system, responsible for all of the communication between the software and the hardware.

The kernel is arguably the most important part of any operating system and, as such, has been the subject of much research, as well as many arguments.  Today, there are two primary kernel types: the monolithic kernel and the microkernel.

With a monolithic kernel, all of the code to interface with the various hardware in the computer is built into the kernel.  It all runs in “kernel space,” a protected memory area designated solely to the kernel.  Properly built monolithic kernels can be extremely efficient.  However, bugs in any of the device drivers can cause the entire kernel to crash.  Linux is a good example of a very well built monolithic kernel.

A microkernel, on the other hand, is a minimalist construct.  It includes only the hooks necessary to implement communication between the software and the hardware in kernel mode.  All other software runs in “user space,” a separate memory area that can be swapped out to disk when necessary.  Drivers and other essential system software must “ask permission” to interact with the kernel.  In theory, buggy device drivers cannot cause the entire system to fail.  There is a price, however: the overhead of the system calls required to access the kernel.  As a result, microkernels are considered slower than monolithic kernels.  MINIX is a good example of an OS with a microkernel architecture.

The Windows NT line of operating systems, which includes XP and Vista, uses what Microsoft likes to call a “hybrid kernel.”  In theory, a hybrid kernel combines the best of both worlds: the speed of a monolithic kernel with the stability of a microkernel.  I think the jury is still out on this, but it does seem that XP, at least, is much more stable than the Windows 9x series of releases, which used a monolithic kernel.

So what does all of this mean?  Well, Microsoft is attempting to optimize the core of the operating system, making it smaller, faster, and more efficient.  Current reports from Microsoft indicate that MinWin is functional and has a very small footprint: the current iteration occupies approximately 25 MB of disk space and uses about 40 MB of memory, a considerable reduction in both.  Keep in mind, however, that MinWin is still being developed and is missing many of the features necessary for it to be comparable to the current shipping kernel.

Microsoft seems to be hyping this new kernel quite a bit at the moment, but watch for other features to be added as well.  It’s a pretty sure bet that the general theme will change and that flashy new gadgets, graphical capabilities, and other such “fluff” will be added.  I’m not sure the market would respond very nicely to a new version of Windows without more flash and shine…  Windows 7 is supposedly going to ship in 2010, though other reports have it shipping sometime in 2009.  If Vista is any indication, however, I wouldn’t expect Windows 7 until 2011 or 2012.

Meanwhile, it seems that Windows XP is still more popular than Vista.  In fact, it has been reported that InfoWorld has collected over 75,000 signatures on its “Save Windows XP” petition.  This is probably nothing more than a marketing stunt, but it does highlight the fact that Vista isn’t being adopted as quickly as Microsoft would like.  So perhaps Microsoft will fast-track Windows 7.  Only time will tell.

Vista

It’s been a while since Microsoft released their newest OS, Vista, and yet the complaints just haven’t stopped.  I just ran across this humorous piece about “upgrading” to Windows XP and decided it was time to write a little bit about Vista.

I can’t say I’m an expert by any means, as I’ve only had limited experience with Vista at this point.  What experience I have had, however, was quite annoying and really turned me away from the thought of installing it.  Overall, Vista has an interesting look.  It’s not that bad, in reality, though it does seem to be a bit of overkill in the eye candy department.  It feels like Microsoft tried to make everything shiny and attractive, but ended up with a gaudy look instead.

My first experience with Vista involved setting up a Vista machine for network access.  Since setting up networking involves changing system settings, I was logged in as an administrator.  I popped open the control panel to set up the network adapter and spent the next 15 minutes messing around with the settings, prompted time and again to allow the changes I was making.  It was a frustrating experience, to say the least.  Something that takes me less than a minute to accomplish on a Windows XP machine, or even on a Linux machine, takes significantly longer on a Vista machine.

I also ran into frequent, quite noticeable pauses while manipulating files.  This happened on more than one machine, making me think there’s something wrong with the file subsystem in Vista.  I’ve heard it explained as a DRM mechanism, checking for various DRM schemes in an attempt to enforce them.  Either way, it’s slow, and simple copy and paste tasks take forever.

One of my more recent experiences was an attempt to get Vista to recognize a RAZR phone.  I never did get that working, even with Motorola’s Vista-compatible software.  I tried installing, uninstalling, and re-installing the software several times, rebooting in between and enduring the stupid security dialogs all the while.  Vista seemed to recognize the phone, but would not let me interact with it.

They say that first impressions are the most important and, up to this point, Vista has not made a good impression on me at all.  If and when I do move to Vista, it will be with me kicking and screaming the entire way…