#76
Google tracked his bike ride past a burglarized home. That made him a suspect.
On 11/03/2020 23.45, anonlinuxuser wrote:

On 3/11/20 4:19 PM, nospam wrote:

In article , Carlos E. R. wrote:

Is it the Intel arch that is at fault for cyber attacks?

no. Yes. no. yes. no. Certainly yes, you are showing your ignorance.

Well, I don't know what actual malware has been using them, but Intel CPUs are plagued with "hardware" bugs that can be used for attacks, discovered over the last two years.

there have been attacks long before x86 exploits were known, and android and ios are not immune either, and neither uses intel.

Not the same thing. Go illustrate yourself.

i didn't say it was the same thing. the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Side channel vulnerabilities on ARM? Dunno. Intel, several. And some of them, impossible to fully solve.

https://en.wikipedia.org/wiki/Meltdown_(security_vulnerability)
https://en.wikipedia.org/wiki/Spectre_(security_vulnerability)
https://en.wikipedia.org/wiki/Intel_Management_Engine#Security_vulnerabilities
https://en.wikipedia.org/wiki/Intel 2018–2020 security flaws
https://en.wikipedia.org/wiki/Side-channel_attack

--
Cheers, Carlos E.R.
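[Editor's aside: Meltdown and Spectre are hardware side channels in speculative execution and can't be reproduced in a few lines of script. As a rough software analogue of the same *class* of leak, here is a sketch of a timing side channel and its standard fix; the function names are illustrative, not anything from the thread.]

```python
import hmac

# Toy software analogue of a timing side channel (NOT Spectre/Meltdown,
# which exploit speculative execution in hardware): a naive comparison
# bails out at the first mismatching byte, so how long it runs leaks
# how many leading bytes of a guess were correct.
def naive_compare(secret: bytes, guess: bytes) -> bool:
    if len(secret) != len(guess):
        return False
    for a, b in zip(secret, guess):
        if a != b:            # early exit: running time depends on the data
            return False
    return True

# The standard mitigation: a comparison whose running time does not
# depend on where the first mismatch occurs.
def safe_compare(secret: bytes, guess: bytes) -> bool:
    return hmac.compare_digest(secret, guess)
```

Both return the same answers; the difference an attacker cares about is only *when* they return, which is exactly the kind of information the hardware variants leak through the cache instead of the clock.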
#77
Carlos E. R. wrote:
On 11/03/2020 19.52, anonlinuxuser wrote:

I've read the history on the original ARM in the UK, where they forgot to connect the power to the processor yet it ran, drawing the power from other chips. That part is impressive.

That's a side effect that some electronics chips have. For example, the CMOS 4000 family.

When that happened at work, even the LEDs lit up. The secret is, with phantom power like that, the DC voltage on the unpowered subsystem is lower. The "hot" part of the circuit might run at 5V, while the "phantom" powered part runs at 3.7V or so. And on our system in the lab, that was sufficient to properly light some red indicator LEDs. The phantom circuit was fully functional. The LEDs were a bit "dull", which was annoying.

You can't run a heavy load this way. Running a CPU off a phantom rail would draw too much current and the rail would collapse. However, for low-power circuits (4000 series CMOS being an example), with the nodes not doing a lot of toggling, phantom power works fine. Our subsystem at work (a drawer in a rack) performed its function just fine with phantom power.

Intel hardware has this issue too, and Intel demonstrates how to stop this effect. The Southbridge (PCH) houses the RTC and CMOS RAM. This sits in a logic island called the "CMOS well". To prevent phantom power leaking from the CR2032-coin-cell-powered well into the general parts of the SB/PCH, they use transmission gates for all logic signal I/O. This stops phantom flow out of the well, and allows your CMOS CR2032 battery to last three years (if the PC is stored in the junk room). The transmission gate is a series pass element, and the control signal makes all of those go "open circuit", so the phantom power cannot flow out of one subsystem (the powered well) into the other (the unpowered SB).

Back when we had the demo in the lab of this effect, there wasn't a proper transmission gate as such. There were 4016 chips at some point, which might have come close. Some versions of these were termed "video switches". They have a bandwidth of about 1/10th of a modern one. A modern one has a 5 ohm ON resistance and 50pF stray capacitance, whereas this one is 280 ohms or so (as weak as ****). But it probably would have worked... if we'd thought of it :-/ Oops.

http://www.ti.com/lit/ds/symlink/cd4016b.pdf

Paul
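[Editor's aside: for readers wondering where a "bandwidth" figure for an analog switch comes from, a rough first-order sketch. Assumptions: the switch plus its load is modeled as a simple RC low-pass; the 50 pF load capacitance is an illustrative round number — only the 280-ohm and 5-ohm ON resistances come from the post.]

```python
import math

# Back-of-the-envelope only: treat the analog switch driving a load
# capacitance as a first-order RC low-pass, f_3dB = 1 / (2*pi*R_on*C).
# R_on values are the figures quoted in the post (CD4016-era vs. modern);
# the 50 pF load is an assumed illustrative value.
def f3db_hz(r_on_ohms: float, c_load_farads: float) -> float:
    return 1.0 / (2.0 * math.pi * r_on_ohms * c_load_farads)

old_switch = f3db_hz(280.0, 50e-12)   # roughly 11 MHz
new_switch = f3db_hz(5.0, 50e-12)     # roughly 640 MHz
```

With the same load assumed for both, the ratio comes out larger than the "about 1/10th" quoted above, so the real comparison presumably also involves different stray capacitances; the point of the sketch is only that ON resistance dominates how fast the switch can be.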
#78
anonlinuxuser wrote:
On 3/11/20 4:19 PM, nospam wrote:

In article , Carlos E. R. wrote:

Is it the Intel arch that is at fault for cyber attacks?

no. Yes. no. yes. no.

Well, I don't know what actual malware has been using them, but Intel CPUs are plagued with "hardware" bugs that can be used for attacks, discovered over the last two years.

there have been attacks long before x86 exploits were known, and android and ios are not immune either, and neither uses intel.

Not the same thing. Go illustrate yourself.

i didn't say it was the same thing. the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Numerically, Intel has more of these issues than ARM.

https://danielmiessler.com/blog/simp...tdown-spectre/

I don't know if there are any up-to-date tables for consumers tracking this stuff. Initially there were attempts to provide tools for tracking, but that's a full-time job for someone, and I doubt anyone has the time.

Paul
#79
On 12/03/2020 03.10, Paul wrote:
Carlos E. R. wrote:

On 11/03/2020 19.52, anonlinuxuser wrote:

I've read the history on the original ARM in the UK, where they forgot to connect the power to the processor yet it ran, drawing the power from other chips. That part is impressive.

That's a side effect that some electronics chips have. For example, the CMOS 4000 family. When that happened at work, even the LEDs lit up. The secret is, with phantom power like that, the DC voltage on the unpowered subsystem is lower. The "hot" part of the circuit might run at 5V, while the "phantom" powered part runs at 3.7V or so.

On the 4000 series, just 0.65 volts less - one diode drop. Diodes that were not designed for that and could of course burn with the load. Once I designed a circuit that could be connected in that mode, so I used external diodes, and resistors on the chip inputs.

And on our system in the lab, that was sufficient to properly light some red indicator LEDs. The phantom circuit was fully functional. The LEDs were a bit "dull", which was annoying. You can't run a heavy load this way. Running a CPU off a phantom rail would draw too much current and the rail would collapse.

right

However, for low-power circuits (4000 series CMOS being an example), with the nodes not doing a lot of toggling, phantom power works fine. Our subsystem at work (a drawer in a rack) performed its function just fine with phantom power. Intel hardware has this issue too, and Intel demonstrates how to stop this effect. The Southbridge (PCH) houses the RTC and CMOS RAM. This sits in a logic island called the "CMOS well". To prevent phantom power leaking from the CR2032-coin-cell-powered well into the general parts of the SB/PCH, they use transmission gates for all logic signal I/O. This stops phantom flow out of the well, and allows your CMOS CR2032 battery to last three years (if the PC is stored in the junk room). The transmission gate is a series pass element, and the control signal makes all of those go "open circuit", so the phantom power cannot flow out of one subsystem (the powered well) into the other (the unpowered SB).

Right.

Back when we had the demo in the lab of this effect, there wasn't a proper transmission gate as such. There were 4016 chips at some point, which might have come close. Some versions of these were termed "video switches". They have a bandwidth of about 1/10th of a modern one. A modern one has a 5 ohm ON resistance and 50pF stray capacitance, whereas this one is 280 ohms or so (as weak as ****). But it probably would have worked... if we'd thought of it :-/ Oops.

http://www.ti.com/lit/ds/symlink/cd4016b.pdf

Paul

--
Cheers, Carlos E.R.
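[Editor's aside: the arithmetic of the phantom-power effect discussed above can be sketched in a few lines. The 0.65 V diode drop is the figure from the post; the LED forward voltage and series resistor are assumed illustrative values, not anything the posters stated.]

```python
# The unpowered rail sits roughly one input-protection-diode drop below
# whatever is driving it through the input pins, which is why the
# "phantom" LEDs light, but dimly.
DIODE_DROP_V = 0.65   # silicon junction drop, figure quoted in the post

def phantom_rail_v(driving_v: float) -> float:
    return driving_v - DIODE_DROP_V

def led_current_ma(rail_v: float, led_vf: float = 1.8,
                   r_ohms: float = 330.0) -> float:
    # Simple series-resistor LED model, I = (V_rail - V_f) / R.
    # led_vf and r_ohms are assumed illustrative values.
    return max(rail_v - led_vf, 0.0) / r_ohms * 1000.0
```

Feeding a 5 V drive through one diode drop leaves 4.35 V on the phantom rail, and the lower rail means proportionally less LED current, which is the "dull but lit" behavior both posters describe.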
#80
In article , Carlos E. R. wrote:

the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Side channel vulnerabilities on ARM? Dunno. Intel, several. And some of them, impossible to fully solve.

that is true, however, cyberattacks exist without that. it's not intel's fault that there are cybercriminals.
#81
On 12/03/2020 12.57, nospam wrote:
In article , Carlos E. R. wrote:

the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Side channel vulnerabilities on ARM? Dunno. Intel, several. And some of them, impossible to fully solve.

that is true, however, cyberattacks exist without that. it's not intel's fault that there are cybercriminals.

But it is Intel's fault that their design has holes that can be used by cybercriminals. And some of those holes cannot be fully plugged, and the plugs that exist slow down the computers notably, so much so that scientific computing has to disable those mitigations. I never said that Intel is at fault for all cyberattacks. But indeed they are guilty of these vulnerabilities.

--
Cheers, Carlos E.R.
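[Editor's aside: whether and how mitigations are enabled is inspectable. On Linux (kernel 4.15 and later) the kernel exports per-vulnerability status under sysfs, and booting with `mitigations=off` is the blunt switch alluded to above; whether scientific sites actually do that is the poster's claim, not verified here. A minimal sketch that reads the interface where it exists:]

```python
from pathlib import Path

# Real sysfs interface on Linux 4.15+; on other systems or older
# kernels the directory simply does not exist.
VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

def mitigation_report() -> dict:
    """Map vulnerability name -> kernel status string.

    Returns an empty dict on non-Linux systems or kernels that
    predate this interface.
    """
    if not VULN_DIR.is_dir():
        return {}
    return {p.name: p.read_text().strip() for p in sorted(VULN_DIR.iterdir())}
```

Typical entries are names like `meltdown` or `spectre_v2` with values such as "Mitigation: PTI" or "Vulnerable"; disabling the mitigations changes the values, not the file names.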
#82
In article , Carlos E. R. wrote:

the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Side channel vulnerabilities on ARM? Dunno. Intel, several. And some of them, impossible to fully solve.

that is true, however, cyberattacks exist without that. it's not intel's fault that there are cybercriminals.

But it is Intel's fault that their design has holes that can be used by cybercriminals.

yes, but that's only a tiny part of cybercrime. by far, the biggest hole is the user, who can be tricked into installing something, and who then blindly grants admin privs. no need for design errors in the hardware or exploits in the operating system.

And some of those holes cannot be fully plugged, and the plugs that exist slow down the computers notably, so much so that scientific computing has to disable those mitigations.

they don't, since the major risk is with shared data on a system with multiple users, versus single-user systems.

I never said that Intel is at fault for all cyberattacks.

nobody said you did. it was someone *else* who made that claim.

But indeed they are guilty of these vulnerabilities.

but only for those.
#83
On 3/11/20 8:54 PM, Paul wrote:
anonlinuxuser wrote:

On 3/11/20 4:19 PM, nospam wrote:

In article , Carlos E. R. wrote:

Is it the Intel arch that is at fault for cyber attacks?

no. Yes. no. yes. no.

Well, I don't know what actual malware has been using them, but Intel CPUs are plagued with "hardware" bugs that can be used for attacks, discovered over the last two years.

there have been attacks long before x86 exploits were known, and android and ios are not immune either, and neither uses intel.

Not the same thing. Go illustrate yourself.

i didn't say it was the same thing. the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Numerically, Intel has more of these issues than ARM.

https://danielmiessler.com/blog/simp...tdown-spectre/

I don't know if there are any up-to-date tables for consumers tracking this stuff. Initially there were attempts to provide tools for tracking, but that's a full-time job for someone, and I doubt anyone has the time.

Paul

That's interesting. I wonder why Microsoft made the Surface Pro X? One thing for certain is that it uses much less power. I also think they had better hurry up and get Visual Studio working natively on ARM as well, or it could be a short run for the X.
#84
anonlinuxuser wrote:
On 3/11/20 8:54 PM, Paul wrote:

anonlinuxuser wrote:

On 3/11/20 4:19 PM, nospam wrote:

In article , Carlos E. R. wrote:

Is it the Intel arch that is at fault for cyber attacks?

no. Yes. no. yes. no.

Well, I don't know what actual malware has been using them, but Intel CPUs are plagued with "hardware" bugs that can be used for attacks, discovered over the last two years.

there have been attacks long before x86 exploits were known, and android and ios are not immune either, and neither uses intel.

Not the same thing. Go illustrate yourself.

i didn't say it was the same thing. the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Numerically, Intel has more of these issues than ARM.

https://danielmiessler.com/blog/simp...tdown-spectre/

I don't know if there are any up-to-date tables for consumers tracking this stuff. Initially there were attempts to provide tools for tracking, but that's a full-time job for someone, and I doubt anyone has the time.

Paul

That's interesting. I wonder why Microsoft made the Surface Pro X? One thing for certain is that it uses much less power. I also think they had better hurry up and get Visual Studio working natively on ARM as well, or it could be a short run for the X.

When Microsoft is not copying Apple, they're trying to develop "counter-moves" to battle Google and Chromebooks. Copying Apple means making $1000 computers. That's Surface products. Battling with Google involves making them pay licensing fees for Microsoft patents. That's to ensure that Microsoft makes some money off every Android sale. And the Win10 OEM OS on devices with small screens is "free", again, as a way to attempt to undercut Chromebooks. That's one of their ecosystem plays. That's where the $100 Black Friday tablets came from.

At one time, somebody at Microsoft said they were going to simplify the SKUs of OS for sale. Now there are at least a dozen of them, I've lost count, and there are more OS versions now than at any time in the past. And so far, none of them are named "Windows 10 Ultimate" :-)

Paul
#85
In article , nospam wrote:

In article , Carlos E. R. wrote:

the fact is that intel is not at fault for cyberattacks.

I see that point. But which is easier to exploit? Intel or ARM?

Side channel vulnerabilities on ARM? Dunno. Intel, several. And some of them, impossible to fully solve.

that is true, however, cyberattacks exist without that. it's not intel's fault that there are cybercriminals.

But it is Intel's fault that their design has holes that can be used by cybercriminals.

yes, but that's only a tiny part of cybercrime. by far, the biggest hole is the user, who can be tricked into installing something, and who then blindly grants admin privs. no need for design errors in the hardware or exploits in the operating system.

Democrats who are in a hurry to get to their gay pr0n!

And some of those holes cannot be fully plugged, and the plugs that exist slow down the computers notably, so much so that scientific computing has to disable those mitigations.

they don't, since the major risk is with shared data on a system with multiple users, versus single-user systems.

I never said that Intel is at fault for all cyberattacks.

nobody said you did. it was someone *else* who made that claim.

But indeed they are guilty of these vulnerabilities.

but only for those.

Intel has a history of sloppy design. They share blame.