APLawrence.com -  Resources for Unix and Linux Systems, Bloggers and the self-employed

Got root?

© November 2008 Anthony Lawrence

Mac, Linux, BSD open for attack: Kaspersky is yet another in a long-running series of predictions that we non-Microsoft folk are in for a rude awakening as soon as those bad-boy virus writers notice us.

In spite of BSD and Linux owning the market when it comes to web servers, Linux tearing up the data centers as the back-end server of choice, making strong moves in the laptop arena, and even gaining significant market share in the general user market, that just hasn't happened, yet.

I had to say "yet" because there's some unwritten law that says you have to imply impending ruination when mentioning this subject, even if you are stating facts that would seem not to support this dire future. To wit, notice the closing word in this title: OSX Malware not taking off yet (www.avertlabs.com link dead, sorry)

Today we know of over 236,000 malicious software items. These are mostly meant for the MS-Windows environment. Only about 700 target the various Unix/Linux distributions. The current known Mac OS X malware count is even lower, at 7, so pretty much non-existent at the moment. For older builds of the Mac OS there are 69 known malicious items, plus an additional 8 for MacHC, which used HyperCard script extensions that had to be manually installed as an add-on package.

Apparently this untenable situation has now existed for long enough that it is actually causing distress in some quarters: Mac.Blorge reports Experts baffled by lack of malware.

That piece points out that Mac systems have now reached over 8 percent of overall market share. And "yet"... the uncooperative malware writers just aren't doing what they are supposed to do.

Or maybe they are just having a harder time doing it.

To be even marginally useful, malware needs administrative access. Without "root" (or its Windows equivalent, an Administrator account), attacks are limited. Not impossible, of course, but harder and less effective. If malware has to escalate privilege, that's one more lock to break; if it already has that privilege, there's nothing to attain. So who has "root"?

Well, just about every single XP machine outside of corporate environments is usually logged in as someone with admin privileges. Even in small businesses, that's the default situation because XP is just too annoying to use any other way. Ford Motor Company surely locks down its XP machines; Joe's Bar and Grille probably does not.

Unix machines? In corporate environments, "root" logins almost never happen. Larger businesses use sudo with limited privileges; even small shops often reserve "root" for shutdown alone. I have rarely, rarely seen common use of "root" on business Unix systems.
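That kind of limited-sudo setup might look something like this in /etc/sudoers (a hypothetical sketch; the group and user names are illustrative, not from any particular system):

```
# /etc/sudoers fragment -- always edit with visudo, never directly.

# Members of the "operators" group may only shut down or reboot the box:
%operators  ALL = /sbin/shutdown, /sbin/reboot

# The "webadmin" user may restart the web server, and nothing else:
webadmin    ALL = /usr/sbin/apachectl restart
```

Nobody covered by rules like these can hand full root to anything they run by accident, which is exactly the point of the argument above.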

Home Linux systems? Sometimes unsophisticated users do run as root. I've seen that, and I've also seen Linux systems get hacked with ordinary accounts using weak passwords. Home users can be sloppy. If Linux has a weakness, there it is: naive new users.

Mac OS X? Almost never is anyone logged in as root. Most OS X systems don't even have a root password set; they operate only with sudo. The majority of Mac users likely don't even know that "root" exists, period.

So, the correlation between likely root users and malware counts seems pretty strong, doesn't it?

Of course correlation doesn't always mean anything. Linux machines are often of higher value than OS X machines (more apt to be servers holding potentially valuable data), so that could explain why Linux gets attacked more than OS X. If Apple ever makes progress in the server market, things could change. However, we have to remember that OS X has a BSD base, and BSD has a long-standing reputation for better security. BSD has always been strong in the web server space, so the attack value is there, and yet ("yet" again!) the reputation stands.

But we can't forget that hacking challenge where OS X was the first to succumb. Of course that wasn't like firing bullets against targets to see which fails first; the attackers made a conscious decision to concentrate their efforts on OS X first. That was probably because they knew something that made them think their method might be successful, and of course it was. My point here is just that although only a handful of threats exist now, a concentrated effort would likely turn up more.

Why isn't that effort being made? Probably the "lack of value" - OS X machines usually aren't storing valuable data. That still leaves value as 'bots, though, but then we get back to the ease-of-compromise issue: Windows PCs are going to be easier to take over, and the "herd" will be homogeneous, with no need for different control code. The bot herder doesn't see Macs as rich hunting ground, and any such machines in a herd would need special handling - probably not worth the effort.

So which is it? The market penetration argument, the probable value explanation, or the "got root" stats? Or maybe all that?

Got something to add? Send me email.


Mon Nov 10 16:11:07 2008: 4755   jtimberman

I was amused when I started using my new MacBook: when I opened a program I had downloaded, I was warned "This program was downloaded from the internet. Cancel or Open?" - and of course there was the password prompt that followed when I said Open and proceeded to install in /Applications.

That reminds me an awful lot of Vista's UAC. Of course, Mac OS uses sudo, though installing in /Applications shouldn't require that since I'm in the admin group that owns the directory. What amuses me though is the open prompt, because Apple had the anti-UAC commercials last year.

As for bot herders, they will go after Linux, BSD, Solaris and Mac servers to get to the userland desktop systems behind them. Generally they're trying to brute-force SSH account passwords (all the more reason to disable passwords and use SSH keys) so they can set up port scanners and IRC bots to collect information about targets that the compromised server can access, usually on private 10. and 192.168. addresses. Once they identify additional targets, they launch the bot attacks to take over the XP systems, which might have unauthenticated connections to valuable server systems through Active Directory, etc.
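The hardening mentioned there - disabling passwords in favor of keys - comes down to a few lines in sshd_config. A sketch, assuming OpenSSH; restart sshd after editing, and make sure your key works first or you'll lock yourself out:

```
# /etc/ssh/sshd_config -- allow key logins only, no passwords to brute-force
PubkeyAuthentication yes
PasswordAuthentication no
ChallengeResponseAuthentication no
PermitRootLogin no
```

With passwords off entirely, those brute-force bots are wasting their time: there is nothing to guess.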

Mon Nov 10 16:20:06 2008: 4757   TonyLawrence

Yeah, I realize that some apps may need to modify system files, but it always makes me a little suspicious when the installer needs sudo. Probably just ignorance in most cases.

Tue Nov 11 13:08:35 2008: 4766   badanov

Even installing Perl modules for PostgreSQL requires every step but the actual installation of the file to be done as a non-root user.

It's funny hearing about brute force attacks to try to break in to get Windows machines:

Why attack open SSH ports on a datacenter if your final intended target is a Windows desktop? The way I understand it, most businesses run their setups through a broad DSL pipe or through a T1. If bots are trying to pass through a *nix machine, why go after an unlikely target that resides in a datacenter? Why not target an actual pipe a business uses?

Tue Nov 11 16:09:34 2008: 4769   jtimberman

Well, by breaking the perimeter of the data center they can access more powerful systems (Unix, Linux or Windows) inside. Windows systems in the past have been less secure and easily compromised (see also Exchange worms). The bot herders probably aren't after data center systems for Windows desktops, but that can also lead them over to userland via unsecured VPNs and direct links from office networks.

These reasons are why it is important to filter outbound connections at the firewall and not just let anyone go to any site they want. It creates more administrative hassle, but it also reduces risk.
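That outbound filtering could be sketched with iptables rules like these on a Linux gateway (a hypothetical example; the interface name and allowed ports are illustrative, and a real ruleset would be more complete):

```
# Default-deny forwarding from the office LAN outward
iptables -P FORWARD DROP

# Let replies to established connections back through
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT

# From the LAN interface, allow only web and DNS out - nothing else
iptables -A FORWARD -i eth1 -p tcp -m multiport --dports 80,443 -j ACCEPT
iptables -A FORWARD -i eth1 -p udp --dport 53 -j ACCEPT
```

A compromised inside machine then can't phone home to an IRC control channel on some arbitrary port, which blunts exactly the bot behavior described above.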

