• 0 Posts
  • 50 Comments
Joined 1 year ago
Cake day: June 11th, 2023

  • I’ve been encountering it more lately, but that’s because of the types of sites I’ve been using.

    The ones that may not work tend to be: banking (usually okay, though), work-related (ranging from applications to gig work to job-specific tools), and anything that requires Chromium to function, such as certain extensions or most functional web music creation tools, like those needing MIDI support.

    B-b-b-buuuuut I only use Firefox and all my stock and banking sites work fine on FF, those job sites that needed Chromium can get by with Edge, and if you’re using a web browser for MIDI tools, really, what are you doing?







  • This is real time and based on one image.

    Deepfakes up to this point are generally not real time: they’re trained on the source, then applied to the video with various methods. Say, the final video is Kermit the Frog doing a dance, but it’s been deepfaked to look like Miss Piggy.

    There are tons of examples of AI that post-process deepfakes. This is one of the few real-time ones: you can link it to a webcam, supply a single photo, and you are the deepfake.

    From my understanding, that hasn’t been done yet, at least not in the AI spaces I’ve been part of.



  • averyminya@beehaw.org to Technology@lemmy.ml · Cut the 'AI' bullshit · 1 month ago

    We’ve had the tech to drastically cut power consumption for a few years now, it’s just about adapting the existing hardware to include the tech.

    There’s a company, Mythic AI, which found that using analog computers (ones built specifically to sift through .CKPT models, for example) drastically cuts energy usage while staying consistently 98-99% accurate. It works by taking a digital request, converting it to an analog signal, processing that signal, then converting it back to digital and sending it to the computer to finish the task.
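    To make that pipeline concrete, here’s a toy sketch in plain Python of the digital-to-analog-to-digital round trip. This is purely illustrative, not Mythic AI’s actual hardware; the bit widths and noise level are made-up assumptions.

```python
import random

def quantize(x, bits=8, full_scale=1.0):
    """Model a DAC/ADC: snap a value to one of 2^bits levels in [-fs, fs]."""
    levels = 2 ** (bits - 1) - 1
    step = full_scale / levels
    return round(x / step) * step

def analog_dot(weights, inputs, noise_std=0.002, rng=None):
    """Model an analog multiply-accumulate: quantized operands (the DAC),
    noisy summation (the analog part), then a quantized readout (the ADC)."""
    rng = rng or random.Random(0)
    acc = sum(quantize(w) * quantize(x) for w, x in zip(weights, inputs))
    acc += rng.gauss(0.0, noise_std)  # noise on the summed analog signal
    return quantize(acc, bits=12, full_scale=float(len(weights)))

rng = random.Random(42)
w = [rng.uniform(-1, 1) for _ in range(256)]
x = [rng.uniform(-1, 1) for _ in range(256)]

exact = sum(wi * xi for wi, xi in zip(w, x))      # full digital result
approx = analog_dot(w, x, rng=random.Random(1))    # analog round trip
print(f"digital: {exact:.4f}  analog round trip: {approx:.4f}")
```

    The real chips sum currents in flash-memory cells; the point of the sketch is that the conversion steps cost a little accuracy but avoid a full digital multiply per weight.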

    In my experience, AI only draws 350+ watts when it is sifting through the model; it ramps up and down consistently based on when the GPU is utilizing the CUDA cores and VRAM, which is when the program is processing an image or a text response (Stable Diffusion and KoboldAI). Outside of that, you can keep Stable Diffusion open all day idle and the power draw is only marginally higher, if it is at all.
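    You can watch that ramp-up yourself. A minimal sketch, assuming an NVIDIA card with the `nvidia-smi` CLI on PATH (the sample count and interval are arbitrary choices):

```python
import subprocess
import time

def parse_power_watts(line: str) -> float:
    """Parse a reading like '347.25 W' from nvidia-smi into a float."""
    return float(line.strip().split()[0])

def poll_power(samples: int = 10, interval_s: float = 1.0):
    """Poll GPU power draw; requires an NVIDIA GPU and nvidia-smi."""
    readings = []
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
        readings.append(parse_power_watts(out.splitlines()[0]))
        time.sleep(interval_s)
    return readings

if __name__ == "__main__":
    watts = poll_power(samples=5)
    print(f"min {min(watts):.0f} W, max {max(watts):.0f} W")
```

    Run it while generating an image and again while idle and you should see the spread the comment describes.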

    So according to Mythic AI, the groundwork is there. Computers just need an analog attachment that removes the workload from the GPU.

    The thing is… I’m not sure how popular it will become. 1) These aren’t widely available; you have to order them from the company and get a quote, and who knows if you can only order one. 2) If you do get one, it’s likely not just going to pop into a basic user’s Windows install running Stable Diffusion; it probably expects server-grade hardware (which is where the majority of the power consumption comes from, so good for business, but consumer availability would be nice). And, most importantly, 3) NVIDIA has sunk so much money into GPU-powered AI. If throwing 1,000 watts at CUDA stops making strides, they may try to obfuscate this competition. NVIDIA has a lot of money riding on the AI wave, and if word gets out that another company can cut costs, both the cost of the hardware and the cost of running it, eliminating the need for multiple 4090s or whatever is best while getting more accuracy per watt, that’s a threat to their position.

    Oh, and 4) Mythic AI is specifically geared towards real-time camera AI tracking, so they’re likely an evil surveillance company, and the hardware itself isn’t geared towards all-around AI but towards the specific models it was built with in mind. That isn’t inherently an issue; it just circles back to point 2): it’s not just the hardware that will be a hassle, but the models themselves too.


  • Google pays a lot to stay the default search engine.

    The other search engines mostly use overlapping indexes.

    Said search engines are also not anywhere near competition to Google.

    Quite frankly, I can only think of 4. DDG, Ecosia, Bing, and Kagi.

    Most people don’t know about Ecosia or Kagi. Most people hardly even know about DDG.

    I wouldn’t consider YouTube as much of a monopoly because, despite it being mostly the only one, from what I understand they haven’t paid out to stay the only one, and don’t really leverage market dominance against others (they probably do, but I just don’t hear about it often). The main reason alternatives don’t exist is simply the massive amount of data that YT needs to store.



  • Gait is walking stance/personality of walking. So if someone walks with a limp, favors one side, slumps a shoulder, etc. Everyone’s is different, so it can be tracked.

    This is very extreme surveillance avoidance, though anyone ditching their phone to go to a protest should also think about their gait and where they are coming from and going to. If you’re at this point of fear, you’d want a change of clothes in a bag, and maybe even different shoes. In like, 98% of U.S. situations this doesn’t apply to us or our protests; it is just good to know. On the night I’m talking about, though, and one other, it was honestly necessary.

    Other than extreme circumstances, though, these are tactics for people who have a legitimate reason to be paranoid, such as Boeing whistleblowers or journalists.


  • I mean, masking during Covid while there were a lot of protesting events going on was a common crossover. Protect your identity and your health.

    Unfortunately these spies do not need just faces for recognition. Gait is analyzed as well, among other things, so I’m not sure how effective it really is.

    Side note: if government surveillance wants eyes on us, and masks prevent surveillance, why would they have pushed masking for Covid? I don’t really feel one way or the other on it, just a conspiracy curiosity.


  • That’s simply not true; there are ways to drastically reduce energy usage while increasing efficiency by offloading the work. A company, Mythic AI, has worked on an analog processor which sifts through the model. On GPUs this is the power-hungry process; for example, a PC with an NVIDIA 3080 will typically run at about 350 W under load.

    Their claim now is that these analog chips use 1/100th of the energy a GPU needs. There’s a video from Veritasium that goes over the details. It’s genuinely effective, and that was a few years ago now, before whatever growth they’ve made with their recent funding. It looks like they actually have products available for inquiry now too.
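    Some back-of-envelope math on what that claim would mean, using the 350 W figure above. The per-query time and query count are assumptions picked purely for illustration.

```python
# Illustrative numbers: a 3080-class GPU at ~350 W under load vs. an
# analog part at roughly 1/100th of that (the company's claim, not a
# measurement of any shipping product).
GPU_WATTS = 350.0
ANALOG_WATTS = GPU_WATTS / 100   # claimed 1/100th of GPU energy
SECONDS_PER_QUERY = 2.0          # assumed inference time per query
QUERIES = 1_000_000              # assumed workload size

def kwh(watts: float, seconds: float) -> float:
    """Energy in kilowatt-hours for a load of `watts` over `seconds`."""
    return watts * seconds / 3600 / 1000

total_seconds = SECONDS_PER_QUERY * QUERIES
gpu_kwh = kwh(GPU_WATTS, total_seconds)
analog_kwh = kwh(ANALOG_WATTS, total_seconds)
print(f"GPU: {gpu_kwh:.0f} kWh, analog (claimed): {analog_kwh:.1f} kWh")
```

    At data-center scale, a 100x gap like that is the difference between a power bill you notice and one you don’t.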

    Doesn’t seem to be at the consumer level yet unless you want to use servers for AI vs. your home computer, but it’s progress. Here’s the thing, I’m not particularly for our current implementation of AI but I don’t think we should be entirely against all of it either. There are clearly plenty of benefits that people see from them, so giving any option possible for companies like Google to severely draw back their energy consumption seems like the reasonable path forward.

    The independent drawbacks to LLMs and generative AI don’t mean the technology will stop getting used. It isn’t going anywhere (as in, people will use it), so making it more efficient is the obvious way to mitigate the waste. You can advocate for the prohibition of AI, but that’s honestly more reckless than advocating that businesses’ AI usage meet a specific energy goal. Forcing these companies to retrofit their servers to run at something ridiculous like 30 W per rack is beneficial for them and for us, as they won’t pay as much for energy and we will all have less of it wasted.

    Wishful thinking of course, but my point is that energy-efficient AI, fortunately or unfortunately, exists and will continue to. We can already run “AI” on a Raspberry Pi 4, which takes what, 9 watts? This technology will get more developed every year, and while I’d be extremely surprised to see a Pi 4 on its own running a subjectively useful LLM, I can imagine a setup that uses a Pi and some offloading tech to achieve reasonable results.

    I’m personally pretty fine with regular people with computers wanting to use AI in whatever way suits them, as long as they aren’t trying to sell the results. While the energy consumption isn’t ideal, it’s a droplet compared to the servers these companies run. We should definitely make every effort possible towards increasing the efficiency of this tech, if only because it seems insane to me to pretend that AI will just disappear, or to let this huge energy suck persist as we hope it begins to fade.

    TL;DR: offload GPU resources to analog chips and force companies to be more efficient, because hoping AI will just disappear is reckless.



  • Quality over quantity.

    1. Meta is so well known for having good moderating. (/s)

    2. Meta is so well known for promoting posts that are active hate speech. (For example, CW in link: suggested “Threads” posts on Instagram have shown transphobic posts to me.) Which kind of goes back to point 1: terrible moderation. Btw, my partner is involved with queer activism on Facebook, so it’s not like I’m being targeted with hateful ads; this is just what they decided to promote, probably because it got a lot of comments and shares. So why do we want Threads users who are actively sharing this rhetoric? It seems antithetical to the entire concept the fediverse was founded on.

    3. What happens to the rest of the fediverse when it’s overrun by millions of Threads users, hundreds of thousands of them promoting this sort of content? All instances will have to pick and choose whether to defederate - something we already do, but we only need to look at Lemmy.World to see why this is a bad thing. Imagine Threads communities become the regularly used ones; now any instance that defederates loses access to the most active communities. In turn, this either kills the defederated communities by keeping them small, or actively encourages those new to the fediverse to just join Threads, since it has “the most active” communities.

    4. Now that there are millions of Threads users, what happens to smaller instances being overrun by traffic their servers can’t handle, or by malicious users on Threads? With Lemmy’s moderation tools this can be a cumbersome and difficult process, since, from my understanding, it becomes a case-by-case situation for the instance moderator, all while the Threads moderation team will likely ignore the inflammatory users. From my understanding, you can have 1 Threads account per Instagram profile, and users can have 5 Instagram profiles. Obviously, this is also a Lemmy issue, but instance admins have control over their users, and Threads as an instance admin historically hasn’t seemed great.

    5. The fediverse is some ~1.5M users. Threads is already 100M. Beyond the server load, there’s the entire idea of it being so big that it naturally becomes a vital resource: E1) Embrace. As it becomes widely used, Meta starts taking an interest in the future of ActivityPub: E2) Extend. And finally, now that it is established and smaller instances are either defederated or effectively shadowbanned, all that is realistically left is Threads content: E3) Extinguish.

    Is the fediverse being more accessible a good thing? Absolutely; not many would argue otherwise. The idea is that Threads gets so big that ActivityPub either can’t exist without Threads, or Threads leeches the userbase from the rest of the fediverse. Someone you like is on Threads but not the rest of the fedi? Well, why have a Lemmy.ML account when you can just have your Threads account?

    Before you know it, we’re back to only having one website again for all of our social media needs.




  • Yeah it’s just a distraction like playing music/water sounds or getting tickled. Honestly, I’ve put an electric massager to my neck/head and the hum relieves the tinnitus pitch a bit. It seems like this is the tongue-version of that, but since it isn’t as loud they’re pairing it with some sound relief.

    A neat idea, I’m glad that it helps people who can afford it. Hopefully it can be priced more reasonably in the future. In the meantime I will have to keep my headphones handy! lol


  • I believe I’m talking about the Sony CRE-C10 over-the-counter hearing aids. I heard about them from an article right before they hit the market. They’re like $1,200 I think, but they’re effectively just Bluetooth hearing aids. They don’t have any particular qualities that make them good for tinnitus; as I mentioned before, it’s about hearing something that isn’t silence so that you’re able to focus on something that isn’t the tinnitus. It should be noted that Sony themselves explicitly say they do not help with tinnitus, which is likely as true as me saying regular headphones don’t “help treat” tinnitus. However, I am pretty much crippled without headphones if I have a really bad flare-up.

    I use almost the inverse of these, the Sony LinkBuds (and the S series). These are Bluetooth earbuds with a gap at the ear canal so you can hear the world around you. The LinkBuds S are closer to a standard pair of earbuds, with noise-cancelling or pass-through sound options, which is overall nicer for being able to block out sounds on the bus. Anyway, all this to say: I only mentioned them because they’re pretty similar to how I use my headphones.

    I can’t speak to how the CRE-C10s are or how effective they might be for my style of tinnitus; I’m merely making assumptions!