• 4 Posts
  • 107 Comments
Joined 3 years ago
Cake day: July 4th, 2023


  • Simple answer is no but…

    Stock Android, like every commercial OS, is inherently spyware. Google has access to it and in theory could do anything, but that stays "in theory" because, as far as we know, stock Android does not ship with keyloggers or data-exfiltration tools. It spies on you by way of "telemetry": Google decides that certain data is useful, "anonymizes" it, and collects it. That data can include wifi networks, location, phone usage, and more.

    So in theory it is possible that stock Android either already contains spyware that collects personal app data and no one ever noticed (very unlikely), or that Google will push an update with such software (somewhat unlikely).

    Now if you use other Google apps, especially Gboard and Google Assistant, you are definitely sharing SOME amount of personal text with Google.

    The reality is that you should consider your threat model: what kind of risk you are willing to take, and what you are willing to change to avoid it. It is perfectly reasonable to say that you are not willing to use Gboard or Google Assistant, but you are willing to use stock Android, understanding that you are sharing some data with Google, but most likely no app data (such as your texts in Signal).

    The same goes for choosing a messenger. WhatsApp is made and managed by Meta, a company that lives off of user data. So even though WhatsApp claims to be (and seems to really be) end-to-end encrypted, you can still be sure that Meta is collecting everything it can, which probably means: who you are texting, how much, at what time, how much you use the app, location, and more.

    Signal is open source and managed by a non-profit with a good track record, and because it is open source you can also choose a different client (like Molly), which further reduces the Signal Foundation's hold on your chats (if you fear that).

    So you could say that because all of your friends use WhatsApp you are willing to accept that Meta will collect a bunch of data on you, or you could decide that you are not okay with that data collection and therefore choose Signal. It is up to you. In any case, E2EE is a must, as it protects you from unauthorized access by hackers.


  • Buying new: basically any of the integrated-memory systems, like Macs or AMD's new AI chips; after that, any modern (last ~5 years) GPU, focusing mainly on VRAM (currently NVIDIA is better supported in SOME tools).

    Buying second hand: you are unlikely to find any of the integrated-memory hardware, so look for any GPU from the last decade that is still officially supported, again focusing on VRAM.

    8 GB is enough to run basic small models, 20+ GB for pretty capable 20-30B models, 50+ GB for the 70B ones, and 100-200+ GB for full-sized models.

    These are rough estimates, do your own research as well.
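    The tiers above follow a simple rule of thumb: parameter count times bytes per weight, plus some headroom for the KV cache and runtime buffers. A back-of-envelope sketch (the 4-bit quantization default and 20% overhead figure are illustrative assumptions, not exact numbers):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int = 4,
                     overhead_frac: float = 0.2) -> float:
    """Back-of-envelope VRAM estimate for running an LLM locally.

    params_billion: model size in billions of parameters (e.g. 30 for a 30B model)
    bits_per_weight: quantization level (4-bit is a common local-inference choice)
    overhead_frac: assumed headroom for KV cache, activations, runtime buffers
    """
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb * (1 + overhead_frac)

# Roughly matches the tiers above:
print(round(estimate_vram_gb(8), 1))    # ~8B model at 4-bit: ~4.8 GB
print(round(estimate_vram_gb(30), 1))   # 30B at 4-bit: ~18 GB
print(round(estimate_vram_gb(70), 1))   # 70B at 4-bit: ~42 GB
```

    Running a model at higher precision (8- or 16-bit) roughly doubles or quadruples these numbers, which is why the "full sized" tier jumps so high.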

    For the most part, with LLMs for a single user, you really only care about VRAM and storage speed (SSD). Any GPU will generate text faster than you can read it for any model that fully fits in its VRAM, so the GPU itself only matters if you intend to run large models at extreme speeds (for automation tasks, etc.). Storage is a bottleneck only at model load, so depending on your needs it might not be a big issue, but for example with a 30 GB model you can expect to wait 2-10 minutes for it to load into VRAM from an HDD, about 1 minute from a SATA SSD, and about 4-30 seconds from an NVMe drive.
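    Those load-time figures are just model size divided by storage throughput. A quick sketch (the MB/s numbers are assumed typical sequential read speeds, not measurements):

```python
def load_time_seconds(model_gb: float, read_mb_per_s: float) -> float:
    """Time to stream a model from disk into VRAM, assuming the
    storage read speed is the bottleneck."""
    return model_gb * 1024 / read_mb_per_s

# A 30 GB model with assumed sequential-read throughputs:
print(round(load_time_seconds(30, 150)))    # HDD   ~150 MB/s  -> ~205 s (~3.5 min)
print(round(load_time_seconds(30, 550)))    # SATA  ~550 MB/s  -> ~56 s
print(round(load_time_seconds(30, 3500)))   # NVMe ~3500 MB/s  -> ~9 s
```
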



  • It’s very unlikely that your TV and the device connected to it both support and enable Ethernet over HDMI by default. But if you are unsure, you can test it by connecting the cable and checking whether the TV gets a network connection.

    Personally I also opened my TV and disconnected the wifi card, since in theory the TV could just connect to any open wifi in the area without me knowing, but to each their own threat model.






  • It is not as good as a decentralized system, and even though the server is open source, it isn’t practically self-hostable (technically you could on an intranet, but not easily).

    But the Signal Foundation is a non-profit with external audits and a proven track record of law enforcement requesting data and getting basically nothing (if I remember correctly, they only have your phone-number-to-account mapping and the last time you were online).

    So although it is imperfect, it is an amazing solution, and almost the only 1:1 competitor to WhatsApp/Messenger/iMessage that is privacy-respecting, so I am very grateful for its existence.


  • Just like with any FOSS project, there is some level of trust involved if you go with the main distribution. In theory you are correct that not much stops them from releasing a malicious update, but because it is open source, people would soon notice either that they released new code that is malicious, or that the released version does not match the source code. That kind of scenario is known as a supply-chain attack.

    Since the code is open, you can literally read it for yourself to see exactly what the APK does. You can also fork it and modify it however you like, just like the creator of Molly did (Molly is a fork of the Signal client that adds some security features).



  • MTK@lemmy.world to Selfhosted@lemmy.world · Proxmox or Docker?
    7 months ago

    There are a few reasons why someone might use Proxmox. It doesn’t have to be just security; it can also be network architectures that don’t work as well in Docker, or simply greater control over the services, which is less comfortable in Docker since it is built around pre-made, ephemeral images. There are also services that don’t ship a pre-built Docker image, where someone might not want to bother building their own image infrastructure around them, or that use technologies that are not well supported or not well executed in Docker.

    There is also the fact that Proxmox is meant for production use, which means it’s more stable (than some casual Docker setup running on whatever distro is at hand) and it has very low overhead. Even if you do use Docker, you can run it within Proxmox and gain a lot of capabilities that add to stability and manageability.

    Generally speaking if your threat model is very small, you’re running this within your private network, and it’s not exposed to the internet or anything large like that, then it doesn’t really make a big difference and you should probably just use whatever is comfortable for you.

    I personally moved to Proxmox for three reasons: security, customizability, and stability. Within Docker I found it annoying to have to pull images, write my own Dockerfiles, and update and rebuild them every time. I find it easier to have my own server with its dedicated service, one that I built from scratch and know how to update and modify properly. There is also the advantage that I can use whatever OS I want for different situations. I personally use Linux exclusively, but even within that I can use different distros and run all kinds of services without them interfering with one another in any way, and in extreme cases I can run a Windows VM.

    And another major factor for me was that I just wanted to learn how to do it. I think it’s cool, it was interesting, and I had already used Docker to a level where I felt comfortable with it; it was time to move on and expand my horizons.




  • MTK@lemmy.world to Selfhosted@lemmy.world · Home server advice
    edited · 7 months ago

    Tip: if you have the room for it, looking for second-hand servers (as in actual servers with server hardware) is often really useful.

    As you start hosting more stuff, you realize that RAM and CPU cores are very limited in consumer hardware. With a shitty second-hand server you can have more cores and more RAM than anything in the consumer category, and you can stick an old GPU in it if you want better media performance.

    But if you truly believe that you won’t spread out and that 64 GB of RAM and 8 cores will suffice, just go ahead and build it however you want. It is no different from a regular build. Get a nice SSD, get a wired Ethernet connection, and you are like 90% of the way there.

    Edit: everyone else is giving much better advice, ignore my overkill here. For media and simple game servers with a low energy-consumption target, you are probably better off with a mini PC with an integrated GPU, or, if you want to future-proof a bit, maybe one of those unified-memory machines where your RAM is also the VRAM and can deliver pretty good performance.