![](https://aussie.zone/pictrs/image/2b4eeba8-7fdc-481d-8835-f459eaea2f28.jpeg)
![](https://beehaw.org/pictrs/image/c0e83ceb-b7e5-41b4-9b76-bfd152dd8d00.png)
Consider a summary statement though. Not a big fan of a link with no text.
Thank you for making Cunningham’s Law a reality.
I wonder if you would have made that post without it.
Seeing as you didn’t answer the person that asked for the info, it seems clear.
https://universal-blue.discourse.group/t/best-way-to-install-a-vpn-on-universal-blue/134
The OSTree layering option worked for me; well, I can get it to run once I turn off the OpenVPN profile I still have sitting in Fedora’s network settings. I fumbled around with @[email protected]’s comment and I’ll wait to see if it updates.
Most of my time is spent in Linux Mint but if I ever have to reinstall, I’ll switch over to a ublue flavour.
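For context, layering the Mullvad app on an rpm-ostree system generally means dropping Mullvad’s repo file into place and layering the package on top of the base image. A sketch, assuming the repo URL from Mullvad’s published Fedora instructions (verify against their current docs before running):

```shell
# Download Mullvad's repo definition into the yum repos directory
# (URL is an assumption from Mullvad's install guide; check it first).
sudo curl -fsSL -o /etc/yum.repos.d/mullvad.repo \
  https://repository.mullvad.net/rpm/stable/mullvad.repo

# Layer the package on top of the immutable base image,
# then reboot so the new deployment takes effect.
rpm-ostree install mullvad-vpn
systemctl reboot
```

Layered packages like this are carried forward automatically on each image update, which is why layering is usually preferred over ad-hoc installs on Kinoite-style systems.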
Commenting from a layperson’s perspective for new users: with my minor Linux experience and an inability to remember commands, don’t be frightened of giving it a go. If I can do it, anyone can. I run Fedora Kinoite on a second hard drive, use the BIOS boot menu to boot in, and then rebased to the uBlue Kinoite image using the provided commands once I read about it.
Almost everything is on Flatpak, so I don’t notice much of a difference. I had trouble layering the Mullvad VPN app (originally I just used ovpn profiles) and I’m not sure I did it right in relation to updating, but it seems to work.
Basically, I don’t understand much about it but it’s a completely usable operating system from my perspective.
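For anyone curious, the “provided commands” for rebasing stock Kinoite onto a uBlue image look roughly like this. The image name and `latest` tag are assumptions; check Universal Blue’s documentation for the current image names:

```shell
# Rebase to the unverified image first; this pulls in the uBlue
# signing keys so the signed image can be verified afterwards.
rpm-ostree rebase ostree-unverified-registry:ghcr.io/ublue-os/kinoite-main:latest
systemctl reboot

# After rebooting, switch over to the signed image.
rpm-ostree rebase ostree-image-signed:docker://ghcr.io/ublue-os/kinoite-main:latest
systemctl reboot
```

Because rebasing only swaps the base image, user data, Flatpaks, and layered packages carry across, and `rpm-ostree rollback` can undo it if something goes wrong.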
Thanks for the write-up. It was helpful in increasing some knowledge.
Australian version:
It’s not fixed, unfortunately. We are on 0.19.1 and still having issues.
The workaround is to restart regularly, which clears the federation queue.
Beaver. Always beaver.
Australia is mostly degraded, channelised shallow creeks and erosion problems. Bam, beaver does all the work for us.
Can beavers survive in the subtropics?
Looks like Aussie.Zone has the most mentions of koalas.
https://www.search-lemmy.com/results?query=Koala&page=1&mode=communities
The Aussie Environment and Australia communities are probably the best placed to get started on koalas. I wouldn’t suggest making a new community until you need to; there are a lot of unused ones due to the lack of users. Koalas suit the theme of [email protected] (how do I know? I posted all the posts there).
Austria? Well, then. G’day mate! Let’s put another shrimp on the barbie!
The dream would be that I could sort them into a feed reader and select articles from there, but that won’t happen. Nor would I want a bot doing it either.
Unfortunately, no one else posts, so in the meantime I just have to keep going or let it die.
I made myself a thread of interests I add to over time:
I’m not bothered by it. Just joking around. I come here for things different from memes.
Stop lying!
Mine is in plants, which a lot of models seem to struggle with. It’s not the science side, it’s the application side, so there is another layer of intelligence the AI has to break through to appeal to me (answering my particular questions).
I tested it again with something even more particular and unique to an Australian plant, and it was way off. I think I may have been one of the only people ever to post a particular technique to Reddit, and the AI mustn’t be searching there, as it didn’t know about it even when asked directly. To its credit, it did give a good suggestion on who to contact to find out more.
Thank you. Will do.
I kept playing and tried the scenarios and was getting closer.
I don’t know if anyone will read this, but I did further testing on Perplexity when I got home. It’s probably not the right spot for it.
I tried a trickier question and then chose from the suggested prompts to move forward (it suggests questions related to the original one if you are unsure how to prompt it next). The prompts were intelligent and were probably the next questions I would have asked if I were learning about this topic. In the next answer, it quoted something I wrote, almost word for word, on the exact subject, which, according to me (of course), would be the correct answer.
I’ve never had an AI reference a single thing I’ve written. I had prompted it into a general area where the things I had written existed, so it should be expected, but it made the connection almost instantly and answered the question 100% accurately.
As much as I hate it, well done Skynet.
Edit: After further testing, I can catch it out regularly enough, but still, if I had to tell someone about the topic generally via email, I’d probably recommend it rather than waste time typing it all out. I’ve just put myself out of a job.
I had an interesting result.
I posed a simple question like I did with all the other AIs, with “airoboros-65B-gpt4-1.4-GPTQ for 13 kudos in 369.6 seconds”. It was a bit of a wait, and I understand why.
It gave me a word-for-word copy of a comment from what I assume is a blog post by a Melissa. The topic was related, but only just.
Which LLM do you recommend for questions about a subject? I looked in the FAQ to see if there was a guide to the choices.
Cheers for this. I tried a few of them while I was waiting around and had one excellent result. I’m a near-expert in one topic and I often test AIs against my knowledge for fun.
Perplexity.AI did the best I’ve seen: it sourced its arguments, and for once they weren’t wrong, so if I needed to, I could actually learn more about what it was talking about. It’s not 100%, but the other AIs are so bad at the topic I test them on that I always give up immediately.
I wouldn’t have seen it if it wasn’t for this post so thank you very much.
If I was anywhere else, it would be there. Maybe one day; I won’t rule it out.
Using Local on aussie.zone gets me Australian news, so I don’t need to subscribe to Aus communities, which is a bonus. My Subscribed feed just stays my interests, so Local and Home are critical to my browsing.
https://universal-blue.discourse.group/c/bazzite/5
Discussion forum for the readers.