• 1 Post
  • 159 Comments
Joined 2 years ago
Cake day: June 10th, 2023


  • The fact that this made the news shows it’s not a common occurrence, not to mention that it was the guy’s own government that snatched his phone, not a foreign spy agency.

    Trump has been attacking anyone who supports Palestine, regardless of whether they are crossing the border or not. People really should be cautious when travelling to countries with authoritarian governments, but that’s not because they’re crossing a border or are in a foreign country; it’s the same caution that anyone living under such a regime needs to observe.





  • Ask Claude to add 36 and 59 and the model will go through a series of odd steps, including first adding a selection of approximate values (add 40ish and 60ish, add 57ish and 36ish). Towards the end of its process, it comes up with the value 92ish. Meanwhile, another sequence of steps focuses on the last digits, 6 and 9, and determines that the answer must end in a 5. Putting that together with 92ish gives the correct answer of 95.
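    This isn’t Claude’s actual internals, but the two parallel paths described above (a rough magnitude estimate plus an exact last-digit computation) can be sketched as a toy function. The function name and the reconciliation step are my own illustration, not anything from the article:

    ```python
    def approx_plus_last_digit(a: int, b: int) -> int:
        """Toy sketch: combine a rough-magnitude path with a ones-digit path."""
        # Path 1: rough magnitude, e.g. "40ish + 60ish" for 36 + 59.
        rough = round(a, -1) + round(b, -1)
        # Path 2: the ones digit of the sum, computed from the last digits only
        # (6 + 9 means the answer must end in 5).
        ones = (a % 10 + b % 10) % 10
        # Reconcile: among nearby numbers ending in `ones`, pick the one
        # closest to the rough estimate.
        candidates = [base + ones for base in range(rough - 20, rough + 21, 10)]
        return min(candidates, key=lambda x: abs(x - rough))
    ```

    For 36 and 59 this yields 95, matching the answer the article describes Claude assembling from “92ish” and “ends in 5.”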

    when Claude was given the prompt “A rhyming couplet: He saw a carrot and had to grab it,” the model responded, “His hunger was like a starving rabbit.” But using their microscope, they saw that Claude had already hit upon the word “rabbit” when it was processing “grab it.”

    … [turned] off the placeholder component for “rabbitness.” Claude responded with “His hunger was a powerful habit.” And when the team replaced “rabbitness” with “greenness,” Claude responded with “freeing it from the garden’s green.”









  • These are the answers they gave the first time.

    Qwencoder is persistent after 6 rerolls.

    Anyways, how do I make these use my GPU? The ollama logs say the model will fit into VRAM and that all layers are being offloaded, but GPU usage doesn’t change and the CPU gets the load. And regardless of the model size, VRAM usage never changes and RAM only goes up by a couple hundred megabytes. Any advice? (Linux / Nvidia)

    Edit: it didn’t have CUDA enabled, apparently; fixed now.
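    For anyone hitting the same thing, a few quick checks can confirm whether ollama is actually using the GPU. This assumes the stock systemd service on Linux with Nvidia drivers; the exact log wording may differ between versions:

    ```shell
    # Is the driver and CUDA runtime visible to the system at all?
    nvidia-smi

    # While a model is loaded, the PROCESSOR column should report GPU,
    # not "100% CPU".
    ollama ps

    # Look for CUDA detection or initialization messages in the service logs.
    journalctl -u ollama | grep -i cuda
    ```

    If `nvidia-smi` works but the logs show no CUDA device being picked up, the problem is usually the ollama build or install missing CUDA support, which matches the fix in the edit above.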