• 0 Posts
  • 27 Comments
Joined 2 years ago
Cake day: June 2nd, 2023




  • I am one of those. I ditched Signal, went back to the stock SMS app, and adopted Matrix. Haven’t looked back since. The reality is that Signal dropping support for SMS wasn’t going to stop me from using SMS; for that, everyone else would need to be convinced to stop using it at the same time, and Signal didn’t have nearly the market share needed to make that happen. Now that card has been played, and nothing has changed. Signal is just another messaging app among hundreds. At least Matrix offers a real paradigm shift.






  • Sounds like neither of you watched the video. Fortunately, I did, so here’s a quick summary. The thesis is that music is getting worse, for a few reasons. The author argues:

    • Auto-Tune and other modern digital production tools are overused to correct pitch and timing, making music sound too synthetic. Real music has imperfections; scrubbing them all out makes it sound artificial. Basically, it takes the human element out of it.
    • Streaming has cheapened the value of a single song because of how easy it is to skip to another one. So arguably it isn’t just the music that’s worse; it’s our appreciation of it.

    The first point has been touched on by many other people, and it’s a common trend in a lot of places outside of music too. People are replaced with machines and processes in many settings, especially in corporations and commerce, and while that’s great for efficiency and predictability, it creates a sterile landscape devoid of human expression. This is not to say all music suffers from this, but mass-market music is a chief culprit.

    The other point really resonates with me when it comes to video games and video game sales. You can get a dozen great Steam games for the price of a single Nintendo title, yet I’ve probably put 10x the time into that one Nintendo title as into all the other Steam games combined; I had to get every bit of value out of that expensive Nintendo purchase. YMMV on this point though. I don’t stream music, so I can’t say how it has affected me personally.




  • The wording of the article implies an apples-to-apples comparison, so 1 Google search == 1 question successfully answered by an LLM. Remember, a Google search in lay terms is not the act of clicking the search button; rather, it’s the act of going to Google to find a website that has the information you want. The equivalent with ChatGPT would be to start a “conversation” and get the information you want on a particular topic.

    How many search-engine queries or LLM prompts that involves, or how broad the topic is, is a level of technical detail that one assumes the source for the 25x figure has already controlled for. (Feel free to ask the author for the source and share it with us, though!)

    Anyone who has even casually used any kind of deep learning will know right away that it consumes an order of magnitude or two more power (and demands an order of magnitude or two more compute) than algorithmic, rules-based software, so a figure like 25x for a similar effective outcome would not be surprising at all if the approach used is unnecessarily complex.

    For example, I could write a neural network to compute 2+2, or I could use an arithmetic calculator. One requires a $500 GPU consuming 300 watts; the other, a $2 pocket calculator running on a few milliwatts, returning the answer before the neural network is even done booting.
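    To make the scale difference concrete, here’s a minimal Python sketch (it needs only NumPy; everything in it is purely illustrative, not a benchmark of any real system): a tiny neural network is trained by gradient descent just to approximate addition, and then both it and plain arithmetic answer “2 + 2”. The network spends thousands of training steps and multiply-adds to produce an approximate answer; the arithmetic is one exact operation.

    ```python
    # Toy comparison: a small neural network vs. plain arithmetic for "2 + 2".
    # Purely illustrative; the point is the relative amount of work, not a benchmark.
    import numpy as np

    rng = np.random.default_rng(0)

    # Training data: pairs (a, b) in [0, 4] with target a + b.
    X = rng.uniform(0, 4, size=(1000, 2))
    y = X.sum(axis=1, keepdims=True)

    # One hidden ReLU layer, trained with plain gradient descent on mean squared error.
    W1 = rng.normal(scale=0.1, size=(2, 16)); b1 = np.zeros((1, 16))
    W2 = rng.normal(scale=0.1, size=(16, 1)); b2 = np.zeros((1, 1))
    lr = 1e-3

    for _ in range(10_000):
        h = np.maximum(X @ W1 + b1, 0.0)          # forward pass
        err = (h @ W2 + b2) - y
        g_out = 2 * err / len(X)                  # gradient of MSE w.r.t. predictions
        g_W2 = h.T @ g_out
        g_b2 = g_out.sum(axis=0, keepdims=True)
        g_h = g_out @ W2.T
        g_h[h <= 0] = 0.0                         # ReLU gradient
        g_W1 = X.T @ g_h
        g_b1 = g_h.sum(axis=0, keepdims=True)
        W1 -= lr * g_W1; b1 -= lr * g_b1
        W2 -= lr * g_W2; b2 -= lr * g_b2

    # "2 + 2" via the network: thousands of multiply-adds for an approximate answer.
    q = np.array([[2.0, 2.0]])
    nn_answer = (np.maximum(q @ W1 + b1, 0.0) @ W2 + b2).item()

    # "2 + 2" via arithmetic: one addition, exact, effectively free.
    calc_answer = 2 + 2

    print(f"neural network: {nn_answer:.3f}   calculator: {calc_answer}")
    ```

    The network gets close to 4 only after burning through all that training, while the single addition is exact on the first try; scale that gap up and the power difference between the two approaches stops being surprising.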




  • These freedoms are indeed a strength, but they are also a vulnerability that can be exploited by foreign powers. Freedoms remain free only so long as the people exercising them do so responsibly, and I think a lot of people in the US do not. I think a lot of Americans are being manipulated into voting autocracy into power. Ironically.

    Complete and total freedom is just anarchy, and anarchy collapses on itself and turns into autocracy.





  • This article replaces the “Google is cracking down on ad blockers” mantra with “Google is consolidating control by treating restrictions on general-purpose computing as the security model”.

    Honestly, I’m not sure this is a better look. It’s true that this is “more secure”, in the sense that it limits the power afforded to malicious extensions, but it completely ignores the collateral damage: it strips individuals of the power to enact their own policies, forcing them to go through Google to accomplish the same thing.

    Ultimately, this is just another step in the direction of WebDRM and centralized control, and more erosion of what made the Internet great. It’s one more step toward turning the Internet into a TV set.

    Fuck. This. Shit. Give me back web 1.0.


  • Phone unlock. Is unlocking a phone unethical? Categorically, no.

    Facial recognition is a tool, and like any other tool there are ways it can be used for good and for bad. In fact, I can’t think of a single tool, guns and nuclear bombs included, that doesn’t have some potential use for good in addition to bad. You might even say that the very definition of a tool is that it has a desirable application, and that a good use is merely a desirable application where the collateral damage of its use is contained or offset by the benefit.

    Perhaps what you mean to say is that it’s corruptible? That is, that use of the tool tends to devolve into other unethical uses and consequences? I might be in agreement with you on that one.