Sounds Good
Apr 5, 06:21 PM
Under the Apple menu on the top toolbar, you can access both recently used programs and recently used files just the same as in the Windows Start menu.
Ahh, good. Thanks. Are we able to put our "favorite" programs or files there too, like on the Windows Start menu? (even if they are not the most recently used?)
It's essentially the same thing, but better.
Why / how is it better?
dashiel
Oct 7, 01:45 PM
Cause it's not. I played with the iPhone SDK for a test app and had to relearn a few things. For example, the + or - in front of a method, which marks a class or instance method respectively. I couldn't find the right information (or Google keywords) to get it without a few bouts of swearing.
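For readers unfamiliar with the + / - prefixes mentioned above: in Objective-C, "+" marks a class method and "-" an instance method. A rough analogy in Python (the class and method names here are invented purely for illustration):

```python
# Analogy only: Objective-C "+" methods behave like Python classmethods,
# and "-" methods like ordinary instance methods.

class Counter:
    _instances = 0

    def __init__(self):
        Counter._instances += 1

    @classmethod           # like a "+" method: called on the class itself
    def instance_count(cls):
        return cls._instances

    def bump(self):        # like a "-" method: called on one object
        self.value = getattr(self, "value", 0) + 1
        return self.value

c = Counter()
print(Counter.instance_count())  # class-level call
print(c.bump())                  # instance-level call
```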
Then my company got a contract to port an iPhone app to Android. And by port I mean rewrite since we can't share anything from obj-c to Java.
Coming from a C/C++ background, the learning curve was really quick. Plus Google did a relatively good job with its SDK and emulator which work pretty well on both Mac and Windows.
hmm i've had the opposite experience. coming from an actionscript/javascript background i've been thoroughly impressed with the sdk in particular and obj-c in general. there's definitely a learning curve, but i suspect that would be true going to any real programming language.
skunk
Mar 11, 03:55 PM
2149: The Kyodo news agency is now citing a safety panel as saying that the radiation level inside one of the reactors at the Fukushima-Daiichi nuclear plant is 1,000 times higher than normal.
http://www.bbc.co.uk/news/world-middle-east-12307698
Looking hairier by the minute. :eek:
gorgeousninja
Apr 9, 06:36 AM
Oh, and try to be more mature in your reply next time please. That was uncalled for and childish.
Actually the post was funny and to the point; you're coming across as arrogant and ill-informed.
MacMiniOwner
Sep 12, 03:53 PM
I think the iTV is a fairly 'dumb' box that just drags media off your Mac onto your TV... I've been doing this for years with a chipped Xbox :)
aegisdesign
Sep 20, 05:57 AM
If Apple could include at least a DVD burner and ideally a DVR hard disk as well, then I could actually start replacing the other machines I have rather than just adding to them and cluttering up my living room.
Er, that's what your Mac is for.
All these calls for adding tuners, hard drives and burners are missing the point. Those functions belong in the host computer. iTV is just a method of getting the content from your Mac/PC to your stereo or TV.
In Microsoft terms, it's a media center extender, nothing more, albeit a pretty one.
If it's got a hard disk in it that's used for anything more than caching your iTunes Library file and thumbnails, I'd be very surprised.
ezekielrage_99
Sep 25, 11:32 PM
And the wait for 8-core Mac Pros and Merom MacBook Pros/MacBooks is on ;)
Waiting for speed bumps means no one buys a dang thing :cool:
Edge100
Apr 15, 11:26 AM
Errr. Yes I do. :confused:
That's why I called him out on it.
He suppressed the part that really matters.
Sorry, getting tough to keep track of who I'm quoting here. ;)
paul4339
Apr 21, 12:17 AM
It skews the numbers nonetheless. iOS is on four different devices: the Apple TV, iPod touch, iPhone, and the iPod touch jumbo. And Google doesn't make any hardware; they work with companies to have it made, like the Nexus series.
The comScore data tracks the number of users ... so if you use four idevices, it's still counted as one user... at least that's what the article mentions.
firestarter
Mar 14, 06:21 PM
What coal-fired power station had the capability of endangering so many people?
Depends whether you believe in global warming. Should we be looking to expand our nuclear power capability, or revert to burning hydrocarbons?
James Lovelock described nuclear as 'the only green choice'.
Pants
Oct 9, 04:18 AM
I've been using XP Pro for 3 months here at work, and I have to say I'm quietly impressed. It's never crashed, nothing has unexpectedly quit (and it's running a bunch of custom PCI cards, so if ever it was flakey, I'd have expected it to be so with this rig...). My only complaint is the 'look' of it - OS X does look nicer, but then OS X is a lot less snappy.
So where does my money go to with Apple? I possess a bunch of Apples, and each time I buy a new one I feel a little less 'happy' and a little more like a regular consumer. After all, the days of proprietary hardware being used in Apples are gone - it's all USB and FireWire (and not even cutting-edge USB at that). Some of my reasons for disliking M$ are also beginning to surface with Apple - .Mac for a start. What OS X has done is open my eyes to using Linux at home (or maybe x86 Solaris) ...switching? hmmm....
Oh, and did anyone mention that Apple's floating point performance was good? No - it's awful!
bedifferent
May 2, 04:51 PM
unbiased as opposed to a Mac site.... yeah right!
Mac users tend to be a better target for old fashioned phishing/vishing because...well, 'nothing bad happens on a Mac..' right?
Sure it can, but it's the percentage and the variables of these "bad" incidents that are key as you are generalizing without specifics.
How about unbiased studies, and percentages of viruses and malware between the two? Those would be facts (again, from an impartial party/experiment).
Also, you're on a Mac based website, so of course there are OS X defenders. Go to Engadget, et al if you don't wish to be here, you're free to decide :)
Dagless
Apr 13, 04:51 AM
Hmm, could be good. I've been using an old Final Cut that I bought whilst in university. Worked just fine for me but I only edit game trailers on it. If FCPX is that cheap in the UK too then I might just have to get it.
Maybe.
Apple have been doing a lot of simplifying of late and I hope it doesn't lose any depth.
leftPCbehind209
Apr 12, 10:37 PM
From what I gathered, if it doesn't, at the very least it transcodes them in the background as you import them, so you can work on them straight away.
But it might actually work natively. It was strongly suggested a lot more files could be imported natively, DSLR was mentioned.
Thanks, I figured as much too. Big improvement from before.
Also, way too many haters here on iMovie. For weddings, it has been so much easier to skim my clips using iMovie than FC. I don't need a whole lot to put a wedding together...iMovie has been perfect...it just lacked majorly in color correction.
Aduntu
Apr 23, 02:44 PM
Genesis 1:13 And the evening and the morning were the third day
That phrasing occurs throughout the creation chapter in Genesis. It looks more than slightly unambiguous WRT the meaning of "day".
Genesis 1:5: "And God began calling the light day, but the darkness he called night." In that same verse, "there came to be evening, and there came to be morning, a first day." In this single verse alone, "day" is used to define two different lengths of time. You can't conclude by the use of the word "day" in Genesis 1 that those days were strictly 24-hour periods.
~Shard~
Oct 31, 09:02 AM
My quad was to ship today, after waiting four business days and two weekend days for a CTO build (2 GB RAM). But I would feel sick to have had the machine for a week when the Octo's are announced. I hope this baby makes Logic Pro sing...
I hope you don't have to wait too long... :o
samcraig
Mar 18, 12:10 PM
Perhaps, but it took them long enough to figure it out, or at least to take any action on it.
It's one thing to have that information, it's another thing to access it and get a report on usage patterns that reliably determines that it is tethering usage. Internet usage can vary widely depending on the user. So it almost requires a human eye to look at it and make that determination. Even then, it can be a hard call.
There are a dozen and one ways they can use rules/logic engines - they don't need a human eye.
And the timing of this new policy isn't by accident nor has it taken ATT "long enough". It's strategic.
With 4.3 - mobile hotspots are now enabled on their network and there is a clear billing system set up within their infrastructure. Remember - prior to 4.3 - ANY tethering via the iPhone was against TOS.
Now that they have a specific plan they can switch you to and/or illustrate that you have LEGAL ways of tethering - they are in a much better position to win any of these so called "arguments."
It's no accident. They clearly have been poised to take action and waited until everything fell into place with the enabling of hotspots.
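A minimal sketch of the kind of rules/logic engine described above. The feature names and thresholds here are invented for illustration; AT&T's actual detection criteria are not public:

```python
# Hypothetical rule-based flagging of likely tethering from coarse
# per-subscriber usage features. All names and cutoffs are made up.

def looks_like_tethering(usage):
    """Return True when at least two heuristic rules fire."""
    rules = [
        usage.get("monthly_gb", 0) > 10,          # unusually heavy data use
        usage.get("desktop_user_agents", 0) > 0,  # desktop browser signatures seen
        usage.get("concurrent_streams", 0) > 2,   # many parallel connections
    ]
    # Requiring multiple rules reduces false positives from heavy phone-only users.
    return sum(rules) >= 2

print(looks_like_tethering({"monthly_gb": 14, "desktop_user_agents": 3}))
print(looks_like_tethering({"monthly_gb": 2}))
```

In practice a carrier would run something like this over billing-system records in bulk, which is why no "human eye" is needed per account.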
ChazUK
Feb 23, 02:02 PM
<snip>
Remember the end of 2006 when the Zune was announced and everyone was running around spazzing out about how dead Apple was and all the usual Microsoft cheerleaders in the tech press were practically wetting themselves in excitement? And a mere month later, what happened? The iPhone was unveiled and all but nullified the Zune.
I think anyone engaging in this kind of speculation should keep that in mind.
What could Apple possibly add to the iPhone which would equal the tech jump which nullified the Zune?
I can't see any phone manufacturer adding much more than is out there now. Faster CPU's, better radio tech, better network tech, better features (cam/storage etc) & updated software is about as far as it can go from here (from my limited vision).
If Apple ever did create such a generational leap as the Zune to iPhone leap this late in the game I would be heartily impressed with them! :cool:
Piggie
Apr 28, 01:20 PM
After reading much of this thread's replies, I can honestly say that MANY MR users are living in 2009. The tablet is a PC. Yeah, maybe it can't do 100% of what a MacPro can do, but it does 90% of it. You can use the iPad as a PC and do lots of productivity.
Sure, I wish it was a stronger machine, but it does word processing, it connects to the internet in different ways, it plays video, it plays music, it stores things, it can share things, it can compute, it is personal, it can do spread sheets, it can make movies, it can take photos, it can play games, it can do lots and lots and lots. Why wouldn't it be a PC? Because it doesn't render CGI films? Hell, it's close to having Photoshop already. Sure, it's no iMac, but an iMac is no MacPro.
If you aren't calling it a PC now, you will in 2012 or 2013. Get used to it now, Technosaurus Rex'ers.
It would help the iPad, in the manner you are describing it, if, like an Android/Honeycomb tablet, it was a machine in its own right.
If you look at the way it works, and the way Apple have designed the OS, it's obvious that Apple do not see the iPad as an independent PC; Apple themselves see it, and have designed it to be, just an extension of your "real" personal computer.
We are having to rely on 3rd-party apps to get around Apple's official built-in limitations for the device. It's linked totally to just one computer running iTunes; you can't even connect it to, say, your PC, your friend's PC and your work PC to upload and download data to and from the various machines.
The iPad, as designed, with Apple's official software, is made so that you set things up and organise things on your PC or Mac, then you dock your iPad (your mobile extension of your PC), you do a few things, then you come back, re-dock the iPad and it gets backed up.
That's the device that Apple made and how they see it.
It's not the iPad's fault. It's how Apple have made it.
The fact that with some 3rd-party apps you can extend its functionality beyond how Apple see the device is neither here nor there.
Personally, I very VERY much hope Apple do allow the iPad to grow into a fully independent device and break its lockdown link to iTunes.
Unfortunately, seeing as the iTunes link is Apple's money-making link, I cannot see them allowing this to happen for a long time, meaning it will never grow to its full potential as a fully independent device.
Hellhammer
Mar 13, 10:29 AM
A Japanese meteorology institute estimates the chance of a 7.0+ earthquake within the next 3 days at 70%, so we will see how well they hold up.
I'm still waiting for the other Icelandic volcano to burst, which is supposed to be much bigger than the one which caused global chaos. All those experts said it will happen "very soon" after the first one but we are still waiting.
Darth.Titan
Oct 7, 11:45 AM
Of course Android might surpass the iPhone. The iPhone is limited to 1 device whereas the Android is spanned over many more devices and will continue to branch out.
You, sir have hit the nail on the head.
therevolution
Mar 18, 05:08 PM
Sorry, I didn't read every post so this may be repetitive but...
why would you pay for something you don't want
To prove a point: DRM is basically useless.
KnightWRX
May 2, 05:51 PM
Until Vista and Win 7, it was effectively impossible to run a Windows NT system as anything but Administrator. To the point that other than locked-down corporate sites where an IT Professional was required to install the Corporate Approved version of any software you need to do your job, I never knew anyone running XP (or 2k, or for that matter NT 3.x) who in a day-to-day fashion used a Standard user account.
Of course, I don't know of any Linux distribution that doesn't require root to install system wide software either. Kind of negates your point there...
In contrast, an "Administrator" account on OS X was in reality a limited user account, just with some system-level privileges like being able to install apps that other people could run. A "Standard" user account was far more usable on OS X than the equivalent on Windows, because "Standard" users could install software into their user sandbox, etc. Still, most people I know run OS X as Administrator.
You could do the same as far back as Windows NT 3.1 in 1993. The fact that most software vendors wrote their applications for the non-secure DOS based versions of Windows is moot, that is not a problem of the OS's security model, it is a problem of the Application. This is not "Unix security" being better, it's "Software vendors for Windows" being dumber.
It's no different than if instead of writing my preferences to $HOME/.myapp/ I'd write a software that required writing everything to /usr/share/myapp/username/. That would require root in any decent Unix installation, or it would require me to set permissions on that folder to 775 and make all users of myapp part of the owning group. Or I could just go the lazy route, make the binary 4755 and set mount opts to suid on the filesystem where this binary resides... (ugh...).
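The octal modes mentioned above (775, 4755) can be decoded with Python's stat module; this is just an illustration of what those bits mean, not part of the original post:

```python
# Decoding the Unix permission modes discussed above.
import stat

mode_775 = 0o775    # rwxrwxr-x: owner and group may write; others read/execute
mode_4755 = 0o4755  # rwsr-xr-x: setuid bit set, so the binary runs as its owner

# stat.filemode() needs the file-type bits too; 0o100000 marks a regular file.
print(stat.filemode(0o100000 | mode_775))   # -rwxrwxr-x
print(stat.filemode(0o100000 | mode_4755))  # -rwsr-xr-x

assert mode_4755 & stat.S_ISUID        # setuid bit present: the "ugh" case
assert not (mode_775 & stat.S_ISUID)   # plain 775 has no setuid bit
```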
This is no different on Windows NT based architectures. If you were so inclined, with tools like Filemon and Regmon, you could granularly set permissions in a way to install these misbehaving software so that they would work for regular users.
I know I did many times in a past life (back when I was sort of forced to do Windows systems administration... ugh... Windows NT 4.0 Terminal Server edition... what a wreck...).
Let's face it, Windows NT and Unix systems have very similar security models (in fact, Windows NT has superior ACL support out of the box, akin to Novell's close-to-perfect ACLs; Unix is far more limited with its read/write/execute permission scheme, even with Posix ACLs in place). It's the hoops that software vendors outside the control of Microsoft made you go through that forced lazy users to run as Administrator all the time and gave Microsoft such headaches.
As far back as I remember (when I did some Windows systems programming), Microsoft was already advising to use the user's home folder/the user's registry hive for preferences and to never write to system locations.
The real difference, though, is that an NT Administrator was really equivalent to the Unix root account. An OS X Administrator was a Unix non-root user with 'admin' group access. You could not start up the UI as the 'root' user (and the 'root' account was disabled by default).
Actually, the Administrator account (much less a standard user in the Administrators group) is not a root level account at all.
Notice how a root account on Unix can do everything, just by virtue of its 0 uid. It can write/delete/read files from filesystems it does not even have permissions on. It can kill any system process, no matter the owner.
Administrator on Windows NT is far more limited. Don't ever break your ACLs or don't try to kill processes owned by "System". SysInternals provided tools that let you do it, but Microsoft did not.
All that having been said, UAC has really evened the bar for Windows Vista and 7 (moreso in 7 after the usability tweaks Microsoft put in to stop people from disabling it). I see no functional security difference between the OS X authorization scheme and the Windows UAC scheme.
UAC is simply a gui front-end to the runas command. Heck, shift-right-click already had the "Run As" option. It's a glorified sudo. It uses RDP (since Vista, user sessions are really local RDP sessions) to prevent being able to "fake it", by showing up on the "console" session while the user's display resides on a RDP session.
There, you did it, you made me go on a defensive rant for Microsoft. I hate you now.
My response: why bother worrying about this when the attacker can do the same thing via shellcode generated in the background by exploiting a running process, so that the user is unaware that code is being executed on the system?
Because this required no particular exploit or vulnerability. A simple Javascript auto-download and Safari auto-opening an archive and running code.
Why bother, you're not "getting it". The only reason the user is aware of MACDefender is because it runs a GUI based installer. If the executable had had 0 GUI code and just run stuff in the background, you would have never known until you couldn't find your files or some chinese guy was buying goods with your CC info, fished right out of your "Bank stuff.xls" file.
That's the thing, infecting a computer at the system level is fine if you want to build a DoS botnet or something (and even then, you don't really need privilege escalation for that, just set login items for the current user, and run off a non-privilege port, root privileges are not required for ICMP access, only raw sockets).
These days, malware authors and users are much more interested in your data than your system. That's where the money is. Identity theft, phishing, they mean big bucks.
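The point above about not needing root for a non-privileged port can be sketched in Python: on a typical Unix system, any user may listen on a port above 1024 (only ports below 1024 and raw sockets are restricted):

```python
# No root required: bind a TCP listener on an unprivileged port.
import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # port 0: the kernel picks a free ephemeral port
port = srv.getsockname()[1]  # ephemeral ports are always above 1024
srv.listen(1)
print(f"listening on unprivileged port {port}")
srv.close()
```

This is why a per-user botnet client needs no privilege escalation at all, which is the scenario the post describes.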
Therbo
May 2, 09:41 AM
Please, enlighten us how "Unix Security" is protecting you here, more than it would on Windows ? I'd be delighted to hear your explanation.
A lot of people trumpet "Unix Security" without even understanding what it means.
The Unix permission system: a virus on Windows can just access your system and non-owned files, whereas Unix/Linux doesn't allow that.
But of course it doesn't protect against bad passwords or stupidity.