#16
Win95
mechanic wrote:
> 25 years on, we are reminded of the big changes Win95 showed compared to the preceding Win3.1 (I'm talking of home/consumer versions). 25 years is a long time - what is it, 12-15 Moore periods? Shouldn't we expect rather more change in the hardware as a result? Software has come a long way, we have AI and virtual reality on the horizon, but the hardware seems to have stagnated.

You're hard to please. CRT monitors were the norm at 800x600 resolution. We now have hugely powerful multicore CPUs, incredibly fast solid-state drives with capacities that were science fiction in the 90s, and phenomenally powerful GPU technology. All of this has enabled high-quality, high-resolution digital photography and film production, advances in science not previously possible, and large-scale data analytics.

> The basic vision behind 'one computer for each home/desk' hasn't changed.

Wrong. Cloud computing, virtual machines, and Docker-like software are all blurring what a computer on your desk can do. From my desk I have access to almost limitless computing power. You're also ignoring all the other computers you may own: a phone, a watch, TVs, gaming consoles, smart speakers, a heating controller, your car, and so on. Cost is also massively reduced - just look at the Raspberry Pi. If you showed a Microsoftie from 1995 what we have today, you'd blow their mind.
#17
Win95
On Tue, 25 Aug 2020 14:28:46 -0400, dale wrote:
> On 8/25/2020 6:29 AM, mechanic wrote:
>> 25 years on, we are reminded of the big changes Win95 showed compared to the preceding Win3.1 (I'm talking of home/consumer versions). 25 years is a long time - what is it, 12-15 Moore periods? Shouldn't we expect rather more change in the hardware as a result? Software has come a long way, we have AI and virtual reality on the horizon, but the hardware seems to have stagnated. The basic vision behind 'one computer for each home/desk' hasn't changed.
>
> won't 128bit computing have to be quantum?

Yes, quantum computers will be a big jump. On the network, the move from copper to optic fibre was a big jump; most other improvements are just engineering tune-ups of known technologies, driven by Moore's Law. Humans still type at their keyboards, though.
#18
Win95
On Wed, 26 Aug 2020 07:16:10 -0000 (UTC), Chris wrote:
> Wrong. Cloud computing, virtual machines, and docker-like software are all blurring what a computer on your desk can do. From my desk I have access to almost limitless computing power.

Yes, and there is a trend to more mobile access too, although most of that is incremental improvement on concepts familiar in 1995. Nowadays, with all this incredible computing power, we can't even produce a system of candidate ratings to replace the exams[1] familiar to earlier generations without introducing unacceptable biases.

[1] See many articles in the Guardian on the (UK) A-level and GCSE exams in schools, and their replacement due to Covid.
#19
Win95
On 2020-08-26 09:35:19 +0000, mechanic said:
> On Wed, 26 Aug 2020 07:16:10 -0000 (UTC), Chris wrote:
>> Wrong. Cloud computing, virtual machines, and docker-like software are all blurring what a computer on your desk can do. From my desk I have access to almost limitless computing power.
>
> Yes, and there is a trend to more mobile access too. Although most of that is incremental improvement to concepts familiar in 1995.

It sounds like you're not familiar with how different the concepts are and how the hardware has enabled them.

> Nowadays, with this incredible computing power, we can't even produce a system of candidate ratings without the exams[1] familiar to earlier generations without introducing unacceptable biases.

That has nothing to do with computers. A computer will only do what it is programmed/told to do. The process was broken from the start and was ****ed up by the people "in charge" - an oxymoron. They could (and should) have done better: https://www.theguardian.com/educatio...-algorithm-row

> [1] See many articles in the Guardian on the (UK) A and GCSE level exams in schools, and their replacement due to Covid.

I'm fully aware, as I have kids involved in this, and so do many of my friends.
#20
Win95
On 8/25/2020 6:29 AM, mechanic wrote:
> 25 years on, we are reminded of the big changes Win95 showed compared to the preceding Win3.1 (I'm talking of home/consumer versions). 25 years is a long time - what is it, 12-15 Moore periods? Shouldn't we expect rather more change in the hardware as a result? Software has come a long way, we have AI and virtual reality on the horizon, but the hardware seems to have stagnated. The basic vision behind 'one computer for each home/desk' hasn't changed.

For the longest time, and still, I would say the software has lagged the hardware. We have ever more cores, but not so much parallel processing... That said, I'm disappointed that we seem to have hit peak CPU clock speeds at 4-4.5 GHz - they have been there for at least 5 years. Now they are just stacking "cores" like apartments. :)
#21
Win95
On 8/26/2020 12:16 AM, Chris wrote:
> mechanic wrote:
>> 25 years on, we are reminded of the big changes Win95 showed compared to the preceding Win3.1 (I'm talking of home/consumer versions). 25 years is a long time - what is it, 12-15 Moore periods? Shouldn't we expect rather more change in the hardware as a result? Software has come a long way, we have AI and virtual reality on the horizon, but the hardware seems to have stagnated.
>
> You're hard to please. CRT monitors were the norm at 800x600 resolution.

Not initially. Back when I got my first personal computer in 1987, the norm was 640x480. 800x600 came later. Today, of course, it's much higher. I run my two 24" LCD monitors at 1920x1080, and many people run at much higher resolutions.

-- 
Ken
#22
Win95
On 26 Aug 2020 at 16:10:41 BST, "Ken Blake" wrote:
> On 8/26/2020 12:16 AM, Chris wrote:
>> mechanic wrote:
>>> [snip]
>> You're hard to please. CRT monitors were the norm at 800x600 resolution.
>
> Not initially. Back when I got my first personal computer in 1987, the norm was 640x480. 800x600 came later.

We're talking about Win95-era computers. In 1995/96, 800x600 was common and 1024x768 wasn't unusual.

> Today, of course, it's much higher. I run my two 24" LCD monitors at 1920x1080, and many people run at much higher resolutions.

I wouldn't use anything less than 2560x1440 nowadays.
#23
Win95
On Wed, 26 Aug 2020 16:06:42 -0000 (UTC), Chris wrote:
> I wouldn't use anything less than 2560x1440 nowadays.

Not much of a jump from 800x600 in 25 years!
#24
Win95
On 8/26/2020 9:06 AM, Chris wrote:
> On 26 Aug 2020 at 16:10:41 BST, "Ken Blake" wrote:
>> On 8/26/2020 12:16 AM, Chris wrote:
>>> [snip]
>>> You're hard to please. CRT monitors were the norm at 800x600 resolution.
>> Not initially. Back when I got my first personal computer in 1987, the norm was 640x480. 800x600 came later.
> We're talking about win95 era computers. In 1995/96 800x600 was common and 1024x768 wasn't unusual.
>> Today, of course, it's much higher. I run my two 24" LCD monitors at 1920x1080, and many people run at much higher resolutions.
> I wouldn't use anything less than 2560x1440 nowadays.

Then you must have a very big monitor. What's best for you isn't best for everyone. I can't imagine anyone with a 15" monitor being happy with that resolution.

-- 
Ken
#25
Win95
On 26 Aug 2020 at 19:53:27 BST, "mechanic" wrote:
> On Wed, 26 Aug 2020 16:06:42 -0000 (UTC), Chris wrote:
>> I wouldn't use anything less than 2560x1440 nowadays.
>
> Not much of a jump from 800x600 in 25 years!

That's an 8-fold increase in area, whilst shrinking the size, weight and energy usage, and improving the quality (analogue vs digital) of the monitor. I can't imagine what kind of desk I'd need for a 27" CRT monitor.

If we look at bandwidth, the difference is even larger. 8-bit colour was common at 800x600, which is 3.8M bits per frame. We can easily do 24-bit at 2560x1440 = 88.5M bits nowadays - nearly 25x larger. At a 60Hz refresh that's 5.3Gbps. Try pushing that with 25-year-old hardware.
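Chris's arithmetic can be reproduced in a few lines (uncompressed frame sizes only; real display links add blanking and encoding overhead, so treat these as ballpark figures):

```python
def frame_bits(width, height, bits_per_pixel):
    """Size of one uncompressed frame in bits."""
    return width * height * bits_per_pixel

old = frame_bits(800, 600, 8)       # mid-90s: 3,840,000 bits (~3.8M)
new = frame_bits(2560, 1440, 24)    # today:  88,473,600 bits (~88.5M)

print(f"ratio: {new / old:.1f}x")              # ~23x, i.e. "nearly 25x"
print(f"at 60 Hz: {new * 60 / 1e9:.2f} Gbps")  # ~5.31 Gbps
```

The per-frame ratio works out to about 23x, which the post rounds up to "nearly 25x"; the 5.3Gbps figure matches.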
#26
Win95
"Chris" wrote
| I wouldn't use anything less than 2560x1440 nowadays.
|
| Not much of a jump from 800x600 in 25 years!
|
| That's an 8-fold increase in area whilst shrinking the size, weight, energy
| usage and improving the quality (analogue vs digital) of the monitor. I can't
| imagine what kind of desk I'd need for a 27" CRT monitor.

I'm using a 27" monitor at 1920x1080. More pixels doesn't mean more advanced technology; it's about the human body using the hardware, and my eyes are getting old. I set up this monitor on a cabinet drawer slide, mounted under the bookshelves over my desk, so I can pull it out or push it back - whatever I find comfortable. It's 24-bit color and it's clear enough.

There might be scenarios where more densely packed pixels are useful, but that would mostly be on tiny screens, like phones. Why? Because they're poorly suited ergonomically, so the detail must be very sharp.
#27
Win95
On 8/27/2020 7:27 AM, Mayayana wrote:
> "Chris" wrote
> | I wouldn't use anything less than 2560x1440 nowadays.
> |
> | Not much of a jump from 800x600 in 25 years!
> |
> | That's an 8-fold increase in area whilst shrinking the size, weight, energy
> | usage and improving the quality (analogue vs digital) of the monitor. I can't
> | imagine what kind of desk I'd need for a 27" CRT monitor.
>
> I'm using a 27" monitor, at 1920x1080. More pixels doesn't mean more advanced technology. It's about the human body using the hardware. My eyes are getting old.

So are mine, as well as the rest of me.

> I set up this monitor on a cabinet drawer slide, mounted under bookshelves over my desk. So I can pull it out or push it back. Whatever I find comfortable. It's 24-bit color and it's clear enough.

How clear the image is has a lot to do with which monitor it is, not just its size and resolution.

> There might be scenarios where more concentrated pixels are useful, but that would mostly be on tiny screens, like phones.

*Tiny* screens? A higher resolution means everything appears smaller. Putting it on a smaller screen makes no sense. Higher resolutions need *bigger* screens.

> Why? Because they're poorly suited ergonomically. So the detail must be very sharp.

-- 
Ken
#28
Win95
In article , Ken Blake wrote:
>> There might be scenarios where more concentrated pixels are useful, but that would mostly be on tiny screens, like phones.
>
> *Tiny* screens? A higher resolution means everything appears smaller. Putting it on a smaller screen makes no sense. Higher resolutions need *bigger* screens.

Actually, it makes a lot of sense. Hi-DPI displays show content at the same size, but significantly sharper and without any jaggies. The difference is dramatic, particularly for those with aging eyes.

The same concept applies to printers: a high-resolution 1200 dpi printer doesn't require a bigger sheet of paper than a low-resolution 300 dpi printer or a very old 72 dpi printer. What it produces is a sharper print on the same size paper.
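The monitor-vs-phone disagreement comes down to pixel density rather than raw resolution. A quick sketch (the diagonal sizes below are nominal, chosen for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" at 1920x1080: {ppi(1920, 1080, 27):.0f} ppi')      # ~82 ppi
print(f'27" at 2560x1440: {ppi(2560, 1440, 27):.0f} ppi')      # ~109 ppi
print(f'6" phone at 2560x1440: {ppi(2560, 1440, 6):.0f} ppi')  # ~490 ppi
```

The same 2560x1440 grid that gives roughly 109 ppi on a 27" desktop monitor gives nearly 490 ppi on a 6" phone, which is why high resolutions make sense on small screens: the content stays the same size but gets much sharper.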
#29
Win95
On 8/27/20 9:05 AM, Chris wrote:
> On 26 Aug 2020 at 19:53:27 BST, "mechanic" wrote:
>> On Wed, 26 Aug 2020 16:06:42 -0000 (UTC), Chris wrote:
>>> I wouldn't use anything less than 2560x1440 nowadays.
>> Not much of a jump from 800x600 in 25 years!
> That's an 8-fold increase in area whilst shrinking the size, weight, energy usage and improving the quality (analogue vs digital) of the monitor. I can't imagine what kind of desk I'd need for a 27" CRT monitor.

I had a 19" flat-screen CRT monitor. IIRC, it weighed about 90 pounds.

> If we look at bandwidth the difference is even larger. 8-bit colour was common for 800x600 which is 3.8m bits. We can easily do 24bits @ 2560x1440 = 88.5m bits, nowadays. Nearly 25x larger. At 60Hz refresh that's 5.3Gbps. Try pushing that with 25-yo hardware.

-- 
Mark Lloyd
http://notstupid.us/

"Religion is all bunk." -- Thomas Edison
#30
Win95
Mark Lloyd wrote:
> I had a 19" flat screen CRT monitor. IIRC, that weighed about 90 pounds.

Think of what you PAID for it, too! I think I bought a light 22" for about $229 about 8 years ago. Haven't checked prices lately.