Wrapping everything up!


At least PDFs can be read by open source tools. They do have their place, and for users who want to lay out graphics and text in specific fonts and places, I consider them much better form than relying on HTML and then insisting on the invention of CSS and other DHTML wotsits.

I do miss the old days when no server would take an email over 10 kB, but I'm much happier to open an attached PDF than yet another bloody HTML-only email. I've become reasonably adept at pulling important links and text out of unrendered HTML source, so that I don't have to fire up some potentially vulnerable browser just to display them.
 
nostalgic internet circlejerk yay!

I, on the other hand, am quite happy we've moved away, as a subculture, from the likes of GeoCities. While I'm not a huge fan of chasing one's own tail in terms of technical resources/requirements, and of having outdated technology before the dust fully settles, Moore's law has still happened, probably directly fueled by the concept of flashier/fatter content. I wouldn't want to imagine a world where technology just went, "OK well, that's good enough, let's stop here."

If the internet had stayed stagnant in its mid-80s/90s form and never evolved, or had gone with the same level of fidelity but leaner, the entire technological landscape would have been significantly held back in response. Content would eventually have gotten more optimized due to those limitations and been pushed to its bounds, but hardware advancement (driven by purchasing power) would have come at a much, much slower rate. The advances we all enjoy today probably exist precisely because the ever-advancing requirements of unoptimized content kept outstripping what the hardware was capable of and asking for more. The everyday consumer isn't intimately familiar with what they should actually expect from their hardware; the idea of something amazing being able to run on a toaster might impress some, but is probably lost on the vast majority of the populace. While not a website, I remember being absolutely floored back in the day by the likes of ZSNES, and more recently DraStic: people taking the time to make something amazing that seems impossible next to the average, far less optimized effort.

Someday Moore's law will hit the proverbial brick wall of the physics barrier (it's still running at full speed, too), and will potentially no longer be able to double the density at half the cost every couple of years, while going wider or taller has its diminishing returns. Optimization will then catch up again with the old habit of throwing more resources at the problem. You see it on game consoles as they age: the first wave of content is usually flashy but unoptimized, but as time goes on you start seeing things like Star Ocean on the SNES happen. Who in the hell would have thought you could do that much on the same chips that ran Super Mario World? True, it was one of the last titles for the system, but coding skill, optimization, and familiarity show that you can do a whole lot once you know the ins and outs of a platform and learn to push the limits of what a no-longer-advancing system can offer. Those things take a lot more time, time we will have once the next major hardware advancement isn't just around the corner. I look forward to that future, and it's a guarantee; being able to throw another, bigger log on the fire isn't, and in fact someday won't be possible.
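
To put rough numbers on that doubling (a stylized back-of-the-envelope, not real process data): even if stagnation merely slowed the cadence rather than stopping it, the compounding loss would be enormous.

```python
# Stylized Moore's-law compounding: toy numbers, not real process data.
def density_after(years: float, doubling_period: float) -> float:
    """Relative transistor density after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for period in (2, 4):
    print(f"doubling every {period} years -> "
          f"{density_after(20, period):6.0f}x density after 20 years")
# doubling every 2 years ->   1024x density after 20 years
# doubling every 4 years ->     32x density after 20 years
```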

So in summary, while we still have headroom for throwing resources at the problem, I say keep it coming; it's driving technology. It's unfortunate from the point of view of "look at what we were once capable of, it's all gone to shit!", but until we get to the point where we can't just wait a year or two and buy something twice as awesome as what we have today, I say let it be. Once we do get to that point, I'll be crying right next to you: clean up the code, etc.
 
Websites are bad, but people & companies are often worse. Who hasn't received an email (usually from a school or other institution that should know better) with a PDF attachment containing a static GIF image of a few dozen lines of text?

The abstraction layers are getting out of control.

I know several people who, to pass on a PDF they received by email, print it on paper, take a picture of it with their phone, and send it using WhatsApp...

I asked them what the point was: "Because email is too difficult".
 
So in summary, while we still have headroom for throwing resources at the problem

I think you misunderstand the concept of progress. Progress is not always the advancement of complexity.

Most of the internet has become complex at the cost of efficiency and usability (the copy/paste-from-StackExchange generation of web development).

The spirit of Pyra and other projects is to rekindle enthusiasm for the raw aspects of computing: maintenance, upgradability, learning, and fun.
Once we do get to that point, I'll be crying right next to you: clean up the code, etc.

You really should take some advice from Uncle Bob.
 
Your progress is never the advancement of your complexity. Your complexity is someone else progressing over you. They have made their life less complex by managing to make yours more complex. Complexity is like entropy: in order to increase enthalpy in a local system, you increase entropy elsewhere. So complexity is the waste byproduct of progress.

The market ensures that the rich control technology and the poor pay for the technology that controls them. So the complexity you get is the waste the technology overlords throw on you. A decade or two ago you could learn to make a website in a month; you could set it up in your basement and have it accessed through a dial-up line. Now you need to accept the terms of the very few companies that control the DNS, the content proxies, the networks, the content indexes and search, the rendering (browsers) and whatnot, and you have to keep dancing to the tune they play. For them it is simple to change conditions and standards; for you it means starting all over.

Mobile phone vendors just buy closed IP from one another and throw shiny gadgets at you that sell your every breath to the highest bidder. Putting something together and selling it is simple (basically, just get a budget). The complexity of understanding it, maintaining it, keeping it running, keeping it barely secure... is thrown at the user. When the user is overwhelmed, they sell the user a new device. When the user brings their frustration to the nearest technophile, they are just criticized for being démodé: "Can't you see the new device is much better? It even has NFC!"

The culture of accepting complex shiny things without understanding them is more like religion than technology. It is dangerous because it reinforces people who refuse to learn their very jobs and never think twice about whether the time they spend at them solves any real problem or creates more. I don't mean only technical jobs. The "email is too difficult" example is a very good one. If email is part of your job, just learn it; don't jump to shiny traps like WhatsApp (well, I'm still in the old mindset where a university could set up a mail server and mail was not a Gmail monopoly; I guess now it is not really so much different from WhatsApp, still different but less so each day). "Email is too difficult" is like going to a pizzeria, ordering a pizza and being given a bare loaf of bread, "because pizzas are more difficult to prepare".

Modern websites are more annoying than old ones. Unwanted animations, huge images void of content, little text, little structure, hard to navigate; you almost have to wait until the content you are looking for happens to cross your viewport. This is no longer hypertext, it is television. But of course with more tracking, miners, and credential-stealing malware than (traditional) TV. Web design is "optimized" for devices that simply don't have the screen real estate to browse the web, with no input device beyond a fat-fingered touchscreen and not even a box to show the URL you're at. The experience is worse than in the 90s, and the equipment required is more complex, shorter-lived and less under my control, but somehow I am expected to love "the progress" because now the squares have rounded corners and everything is typeset in Roboto. There's no technology progress, there's technology companies' progress, which is almost the opposite.

Sorry for the new year rant.
 
I think you misunderstand the concept of progress. Progress is not always the advancement of complexity.

Most of the internet has become complex at the cost of efficiency and usability (the copy/paste-from-StackExchange generation of web development).

The spirit of Pyra and other projects is to rekindle enthusiasm for the raw aspects of computing: maintenance, upgradability, learning, and fun.

You really should take some advice from Uncle Bob.
Not entirely sure how that ties into what I was saying; maybe the intent of what I was saying wasn't clear. I'm not suggesting the internet (or applications & programs) is progressing for the better, but rather that the SIDE EFFECT of that progress is ever-advancing hardware requirements, and THAT is actually progressing for the better.

If "the problem" is the average website being bloated and inefficiently coded, making it so cell phones and PCs need 8 GB+ of RAM and a supercomputer just to not turn into a slug, I am suggesting that's fine, and welcome, because you now have a market where supercomputer-class processors and tons of RAM are being developed and priced for the consumer, forcing the price of everything below them down. You can improve or totally gut and replace code; hardware, on the other hand, is a bit harder to change. Efficient code would not have gotten me my pocket supercomputer, at least not for many years. That is all I'm really saying.
 
Efficient code would not have gotten me my pocket supercomputer, at least not for many years. That is all I'm really saying.

Post hoc ergo propter hoc ("after this, therefore because of this").

I'd say the opposite: if efficient code and practices didn't exist, faster hardware would have been much harder to attain. It's practically impossible to make technical innovations while being lazy and slapdash. Try maintaining a medium-sized application for more than six months or so without sound, clean coding practice, and then tell me again that it helps progress and innovation.

I'd also argue that hardware innovation has been driven predominantly by gaming/entertainment, which has, as a side product (and indirectly), helped exacerbate the current state of the internet.
 
How about setting up a box/VPS just running a browser per X login? A donation to server costs gets forum members access; connect via some remote-desktop app. Anyone tried this? All the javascrap would run on the VPS, and you might be able to hack together a usable experience...
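
For what it's worth, here's a minimal sketch of the server-side launch glue for that idea, assuming a Debian box with xpra and Firefox installed (xpra provides the remote-desktop part; the display number and host names are just placeholders, one way to wire it up):

```python
#!/usr/bin/env python3
# Minimal sketch of the per-login remote-browser idea.  Assumes xpra and
# Firefox are installed on the server; xpra handles the remote-desktop
# transport, so only the launch glue is sketched here.
import subprocess

def start_remote_browser(display: int) -> None:
    """Start a detached xpra session running Firefox on the given X display."""
    subprocess.run(
        ["xpra", "start", f":{display}", "--start=firefox", "--daemon=yes"],
        check=True,
    )

if __name__ == "__main__":
    start_remote_browser(100)
    # A member would then attach from their own machine with something like
    #   xpra attach ssh://user@vps/100
    # so all the javascrap executes on the VPS instead of locally.
```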
 
When I accessed the 'basic internet' of text and maybe a couple of images, I was using a dial-up modem, and pages would typically take longer to load than just about any website now. Back then, in the UK, there was no faster way to access the internet from home, so it was the same for everyone. Hyperlinks would often go to a 404, images were often missing (for similar reasons), layout standards hadn't really been established so every site was different, and it was often hard to find things. I could get a download speed of around 3 KB/s, so just a single 30 KB image (which is tiny) would take 10 seconds to load.

Sure, there was something nice about the basicness of it, the fact that the source of a web page was just what was required to display the information, rather than analytics, tracking, adverts, etc. But on the flip side, we now enjoy insanely large amounts of content free of charge, which is achieved by throwing all this other stuff at the user. There is a bit of a push at the moment for sites to offer the option to pay and avoid the adverts, and only time will tell how it all pans out, but I'm predicting that 'free' will continue to dominate. With advert blocking and script removal, it is possible to get a somewhat refined internet which isn't so bad to use!
 
Not entirely sure how that ties into what I was saying; maybe the intent of what I was saying wasn't clear. I'm not suggesting the internet (or applications & programs) is progressing for the better, but rather that the SIDE EFFECT of that progress is ever-advancing hardware requirements, and THAT is actually progressing for the better.

If "the problem" is the average website being bloated and inefficiently coded, making it so cell phones and PCs need 8 GB+ of RAM and a supercomputer just to not turn into a slug, I am suggesting that's fine, and welcome, because you now have a market where supercomputer-class processors and tons of RAM are being developed and priced for the consumer, forcing the price of everything below them down. You can improve or totally gut and replace code; hardware, on the other hand, is a bit harder to change. Efficient code would not have gotten me my pocket supercomputer, at least not for many years. That is all I'm really saying.

1) That's erroneous Keynesian logic. "Because we built something badly, we can sell more hardware" is ethically equivalent to "We made the part wear out quickly to sell more units", or "We don't talk about the side effects because they induce patients to buy our other, more expensive medication as well", or "We just, you know, steal a bit here and there to boost our alarm-system business."

2) There are plenty of applications hungry for as much CPU/GPU power as we can throw at them; video games have been a larger market than the movie industry for a couple of decades now.

3) Slapping the label 'progress' on something (e.g. a millionfold decrease in utility per CPU cycle) is not an argument: it's a rhetorical ploy, no more.
 
When I accessed the 'basic internet' of text and maybe a couple of images, I was using a dial-up modem, and pages would typically take longer to load than just about any website now. Back then, in the UK, there was no faster way to access the internet from home, so it was the same for everyone. Hyperlinks would often go to a 404, images were often missing (for similar reasons), layout standards hadn't really been established so every site was different, and it was often hard to find things. I could get a download speed of around 3 KB/s, so just a single 30 KB image (which is tiny) would take 10 seconds to load.

I don't know when you were on the "basic" internet, but layout standards were pretty well established by gopher. Granted, we didn't keep using that; we wanted to go do whatever we wanted to, so we did.

The Internet was much more interesting when sites were different, and people put content ahead of discoverability.

As for speed, considering that Internet connections are insanely faster these days... Sure, a website with not much on it might take 30 seconds to load back in the day, but today most websites, even ones that are pretty basic, still take a few seconds to load. So, if you were maxing out at 3 KB/s then, and I've seen modern download speeds over 30 MB/s... something is wrong, proportionally.
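
The proportion is easy to sanity-check with the numbers quoted in this thread (a back-of-the-envelope, not a measurement):

```python
# Back-of-the-envelope check of the proportionality claim, using the
# speeds quoted in this thread rather than real measurements.
dialup = 3 * 1024         # ~3 KB/s dial-up
modern = 30 * 1024 ** 2   # ~30 MB/s modern connection

print(f"link speedup: {modern / dialup:,.0f}x")   # 10,240x

# A 30 KB page that took ~10 s on dial-up would arrive in about 1 ms
# today if pages had stayed the same size:
page = 30 * 1024
print(f"old page on a new link: {page / modern * 1000:.0f} ms")   # 1 ms

# Modern pages still take whole seconds, which only adds up if page
# weight grew by roughly the same factor as the link speed did.
```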

Though, I suppose that's largely due to giant libs being pulled in just to use a few functions.

I'm not sure what my point was here, but I think it's that gopher was cool and organized, and the web isn't fast enough considering modern connections and computing power, proportionally.

Oh, that's right, it feels like everything these days is made at the expense of responsiveness.
 
I don't know when you were on the "basic" internet, but layout standards were pretty well established by gopher. Granted, we didn't keep using that; we wanted to go do whatever we wanted to, so we did.

Were they? Granted, the only software I have installed that will still download a gopher link is lynx (the old text-mode browser), so I've never noticed most layout features beyond 8-space tabs. Reading up on it, gopher serves text either as plain text or as HTML, which won't let you do a 30-70% split layout or anything like that unless you add CSS on top, or abuse tables. Apparently the center tag was still there in HTML 4.01 but got dropped in HTML5.

The Amazon website annoys me these days. It loads pages pretty quickly, but they can take up to a minute (60 whole seconds) to load fully for me, so if you want to zoom into a product image you have to wait. Even the reviews only turn up after about 30 seconds.
 
What worries me more is Debian. The software is pretty much outdated.
I could not compile a Qt application I developed at university because it needed a class that is only available from Qt 5.8. Debian has 5.7 in its repo.

No need to worry; just upgrade to buster, which currently has Qt 5.11. The freeze is coming in the next few weeks, so Debian 10 should be released in about half a year. Plenty of time before the Pyra comes out.

It is called a web-ring, get it right.
I started out writing for Holarse (a German Linux gaming site), which at the time was in a web-ring called The Planet, if anyone remembers that. :-D
It was organised by mmv1, who some might know from the Tux Racer fork PlanetPenguin Racer, which incidentally was originally just called PPRacer, but the Debian maintainers thought that sounded too immature; like being at the whim of Walmart as a game developer.
 
I also have a Gemini. I use it solely with Debian. It does the job well enough for me. I use it every day as a much more portable laptop for email and writing. It has replaced my Pandora.

However, I am looking forward to the Pyra for the openness of the design, both in software and hardware. Beyond the comments that ED has made (keyboard and screen), all of which I agree with, I want the selection of ports and slots that the Pyra has, in particular. The Gemini is seriously lacking in this regard.

Sent from my Nexus 7 using Tapatalk
 
I also have a Gemini. I use it solely with Debian. It does the job well enough for me. I use it every day as a much more portable laptop for email and writing. It has replaced my Pandora.

However, I am looking forward to the Pyra for the openness of the design, both in software and hardware. Beyond the comments that ED has made (keyboard and screen), all of which I agree with, I want the selection of ports and slots that the Pyra has, in particular. The Gemini is seriously lacking in this regard.

Sent from my Nexus 7 using Tapatalk

The upcoming Cosmo takes care of several of the Gemini's shortcomings, but lack of ports does not appear to be one of them. Also, there is no news on whether the Cosmo will somehow overcome the USB C failings of the Gemini. Two ports: the HDMI adapter only works in the right one, charging only works in the left one, a USB C hub only works in the left one, and you cannot charge through the USB C hub. Potential use as a 'desktop replacement' is effectively crushed. I really hope they fix that for the Cosmo; it is my #1 nagging issue with the Gemini.
 
You've reminded me that the one really serious problem with the Gemini under Debian is that the HDMI adapter is not recognised at all. One of my intended use cases for the Gemini was giving presentations when travelling. Definitely waiting for the Pyra!

Sent from my Nexus 7 using Tapatalk
 
You've reminded me that the one really serious problem with the Gemini under Debian is that the HDMI adapter is not recognised at all. One of my intended use cases for the Gemini was giving presentations when travelling. Definitely waiting for the Pyra!
Is Pyra's HDMI-out working?

Sent from my Nexus 7 using Tapatalk
Can you disable that spam in Tapatalk settings... please?
 