Microsoft will try to put a possible hardware backdoor in future processors


Linux-SWAT

Hardcore Member
Joined
Feb 13, 2010
Messages
8,573

PowerGod

Hardcore Member
Joined
Jun 20, 2011
Messages
3,693
Wasn't that already being said before 2000, when they planned the "Palladium" project?
 
  • Like
Reactions: rSl

Swordfish II

Advanced Member
Joined
May 20, 2015
Messages
1,102
It's from an anonymous reader... The first sentence

An anonymous reader shares a report:
Doesn't lend itself to credibility.

Also, they already have hardware backdoors, so nothing new there.
 

Swordfish II

Advanced Member
Joined
May 20, 2015
Messages
1,102
Slashdot has been writing "anonymous reader" for decades when someone submits an article...
Ah, found the actual source article.

Maybe I'm missing something, but how is this any better or worse than the current hardware backdoors?
 

ElPoco

Very Active Member
Joined
Feb 16, 2012
Messages
870
Age
36
Location
Paris, France
It all depends on how this is implemented.
If it's just an isolated chip with limited I/O, running non-reprogrammable open-source firmware that can only handle a few operations (mostly crypto/signature stuff), that can indeed improve security.

Granted, "secure chips" aren't secure. Flaws have been discovered in most existing ones, and either you make them patchable and run the risk of having an attacker abuse the patching mechanism, or you make them unpatchable and run the risk of an unpatchable flaw.
But they're still better than having nothing at all. They've made breaking into smartphones far more difficult than it used to be.
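To make that concrete: on a phone, the OS only ever hands the secure chip a handful of requests like "generate a key" or "sign this" and gets a handle or a signature back. A rough Kotlin sketch of that interface, assuming an Android device with a StrongBox secure element (the key alias and payload are just placeholders):

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.Signature

// Sketch: generate a signing key inside the phone's secure element and use it.
// The host OS only ever holds a key handle; the private key stays in the chip.
fun signWithSecureChip(payload: ByteArray): ByteArray {
    val spec = KeyGenParameterSpec.Builder(
        "demo-key",                                   // hypothetical key alias
        KeyProperties.PURPOSE_SIGN or KeyProperties.PURPOSE_VERIFY
    )
        .setDigests(KeyProperties.DIGEST_SHA256)
        .setIsStrongBoxBacked(true)                   // require a dedicated secure chip; key generation fails without one
        .build()

    val generator = KeyPairGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore"
    )
    generator.initialize(spec)
    val keyPair = generator.generateKeyPair()

    // The signing operation itself is executed inside the secure element.
    return Signature.getInstance("SHA256withECDSA").run {
        initSign(keyPair.private)
        update(payload)
        sign()
    }
}
```

The point is that even a fully compromised OS can only ask the chip to sign things; it can't pull the key material out.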
 

levi

Still fresh, damnit!
Joined
Oct 6, 2008
Messages
13,166
Location
Somewhere off the coast of the EU
I'm not sure any kind of modern chip that runs firmware should be non-upgradable. Even the microcode in my Intel CPU gets updated these days with all of my OS upgrades. But yeah, if you allow attackers to run code on your CPU there's a risk in that; then again, who with any sense would allow that?
 

TeDaDeS

Advanced Member
Joined
Jan 15, 2004
Messages
1,082
Location
The Netherlands
I'm not sure any kind of modern chip that runs firmware should be non-upgradable. Even the microcode in my Intel CPU gets updated these days with all of my OS upgrades. But yeah, if you allow attackers to run code on your CPU there's a risk in that; then again, who with any sense would allow that?
I assume most hacks are due to stuff you wouldn't allow.
 

elvissteinjr

Very Active Member
Joined
Jun 19, 2010
Messages
711
Age
24
Location
Germany
But they're still better than having nothing at all. They've made breaking into smartphones far more difficult than it used to be.
Funnily enough, that's exactly where they bug me the most. What they did is take away my freedom to do what I want with my device. Even if the phone manufacturer allows it, Google, teamed up with the SoC manufacturer, bites back to ensure that exercising my freedom still locks me out of certain software (everything using SafetyNet, no matter how security-relevant the app in question actually is). Workarounds still exist, for now, but they have all the tools to lock it down 100%.
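For reference, the gatekeeping on the app side is only a few lines; a rough Kotlin sketch using the SafetyNet Attestation API, assuming Google Play services is present (the API key and the backend call are placeholders):

```kotlin
import android.app.Activity
import com.google.android.gms.safetynet.SafetyNet

// Sketch: ask Play services for a device-integrity verdict and gate app
// features on it. The API key and the verification backend are placeholders.
fun checkDeviceIntegrity(activity: Activity, nonce: ByteArray) {
    SafetyNet.getClient(activity)
        .attest(nonce, "YOUR_API_KEY")                // placeholder API key
        .addOnSuccessListener { response ->
            // jwsResult is a signed blob; the app's server verifies it and reads
            // the basicIntegrity / ctsProfileMatch fields. Rooted or
            // bootloader-unlocked phones get a failing verdict in there.
            sendForVerification(response.jwsResult)
        }
        .addOnFailureListener { e ->
            // API error (no Play services, network trouble, ...), not a verdict.
            e.printStackTrace()
        }
}

// Placeholder: a real app would POST the JWS to its backend and act on the result.
fun sendForVerification(jws: String?) { /* ... */ }
```

Because the verdict is produced and signed outside the app, patching the app itself isn't enough to fool it, which is exactly what makes it such an effective lock.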
 

TeDaDeS

Advanced Member
Joined
Jan 15, 2004
Messages
1,082
Location
The Netherlands
Funnily enough, that's exactly where they bug me the most. What they did is take away my freedom to do what I want with my device. Even if the phone manufacturer allows it, Google, teamed up with the SoC manufacturer, bites back to ensure that exercising my freedom still locks me out of certain software (everything using SafetyNet, no matter how security-relevant the app in question actually is). Workarounds still exist, for now, but they have all the tools to lock it down 100%.
It also creates the situation where it's easier for companies to use 'cheap' exploits in their own software while it's extremely hard for others to defend against them.
For example, a backdoor in some hardware that makes your equipment easily accessible to government agencies. People would claim that's fine because only the government can access it, and it's for your own good.
But those exploits can also be used by others, and then you wouldn't be able to detect or prevent them.

I would say it's probably better to have a level playing field, where you can detect and prevent exploits as easily as they can be carried out.
That usually leads to open-source hardware and software, which could potentially be supported for as long as people want.
In practice you'll see that's much harder, as the user base for some products is too small and they might not have the knowledge to support it themselves.
So personally I think it's good if open source is also done commercially, so you can rely on support and new companies can grow out of those support needs.
 

levi

Still fresh, damnit!
Joined
Oct 6, 2008
Messages
13,166
Location
Somewhere off the coast of the EU
For example, a backdoor in some hardware that makes your equipment easily accessible to government agencies. People would claim that's fine because only the government can access it, and it's for your own good.
That depends on whose government we're talking about. If my local government is able to hack my software, odds are that the Israelis, the US, the Chinese, the Russians and possibly North Korea would also find it easy to hack the software. Not many people would be happy if all of those countries could get at our secrets.
 