I did nothing and I’m all out of ideas!

  • 4 Posts
  • 166 Comments
Joined 2 years ago
Cake day: June 11th, 2023

  • Copy pasting from another community.

    To make it easier:

    Polly [:polly]
    Comment 21 • 4 hours ago
    
    As discussed on the attached patch, it doesn't sound like unified push is a direction we want to go in at the moment.
    So i'm going to close the bug, but do appreciate the interesting exploration and discussion it has generated.
    Thank you to everyone who contributed thoughts, time and code to this issue!
    Status: ASSIGNED → RESOLVED
    Closed: 4 hours ago
    Resolution: --- → WONTFIX
    

    The comment that was being referenced (I think):

    pollymce requested changes to this revision. Fri, Jun 27, 11:17 AM
    pollymce added a subscriber: pollymce.
    
    hello - thank you for looking into this and submitting this patch stack!
    I have asked around a few people internally but unfortunately, there doesn't seem to be a lot of support for the idea of moving to Unified Push.
    So I think it would be best to abandon the stack rather than spend more time working on this implementation.
    Sorry to have to say that and sorry it's taken so long to get back to you with this message!
    We appreciate your time and effort here and hope that you will still consider contributing to firefox in the future.
    



  • 125g per person if you only eat that, 80 to 100 if you eat something else too

    Good brands you can usually find outside of Italy at not-outrageous prices are Rummo and Garofalo; less good but still okay, and at a generally lower price, you can usually find Voiello or De Cecco

    Barilla is the most common, but outside of their spaghetti I’m not a big fan

    There are a bunch of others, but these tend to be easier to find

    That’s all, thank you for coming to my TED talk





  • If it works from the phone it must be something PC-related; you could try to post and take a look at the browser’s console for any errors

    It is still probably some domain not getting resolved; it could be at the DNS level (are you using a DNS server with block lists integrated?)

    You could try changing it to 1.1.1.1 or 8.8.8.8 for a test, if you are not already using them
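
    A quick way to check this (using lemmy.example as a stand-in for your instance’s actual domain, since I don’t know it) is to query a public resolver directly and compare it with whatever your PC is currently using, e.g. from a terminal:

    nslookup lemmy.example 1.1.1.1
    nslookup lemmy.example

    The first asks Cloudflare directly, the second asks your currently configured DNS; if only the first one resolves, the problem is your DNS or its block list.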






  • Giorgia Meloni is trying to position herself as a mediator between America and Europe, which honestly I think is a fool’s errand, but it is in line with her Atlanticist foreign policy.
    I would have personally preferred a real, firm statement of support for Ukraine, but as long as we have people like Salvini - a Putin lover - and his ‘Lega Nord’ party in the government it is unrealistic to expect one.

    At the same time FdI (Meloni’s party) has historically been pretty pro-Ukraine, so I will be very surprised if nothing happens in the next few days.

    Remember that this quote is from Crosetto, the current Minister of Defence, just a few days ago: “Peace, however, must not mean humiliating a free people who have done nothing but die and sacrifice to defend themselves.”

    I’m not in line with their national policies - by a wide margin - but currently Meloni’s party and the Partito Democratico are the only pro-Ukraine voices in Italian politics with some weight

    It’s depressing.





  • This was all theatre where nothing really changed, staged to show how tough and intransigent this admin wants to be. The real problem is that if formally and informally allied nations start to think that the USA will really commit to bullying as a negotiation tactic, they will start to divest from it and expand their pool of partners to make these tactics less impactful.

    This could be disastrous in the long run, with decades of integration and diplomacy going up in smoke. But in the short term there will be a lot of “winning”, as they keep saying. I guess.

    The first Trump term was seen as a short-term derailing. The second one has a completely different international impact.

    Goodwill is a real form of diplomatic currency, and it is getting burned at a very fast pace.


  • I’ve never used oobabooga but if you use llama.cpp directly you can specify the number of layers that you want to run on the GPU with the -ngl flag, followed by the number.

    So, as an example, a command (on Linux), run from the directory where you have the binary, to start its server would look something like: ./llama-server -m "/path/to/model.gguf" -ngl 10

    This will put 10 layers of the model on the GPU; the rest will stay in RAM for the CPU.

    Another important flag that could interest you is -c, which sets the context size.
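
    Putting the two flags together, a rough sketch would be something like this (the 8192 context size is just an assumption, pick whatever your model and VRAM allow):

    ./llama-server -m "/path/to/model.gguf" -ngl 10 -c 8192

    By default the server then listens on localhost:8080.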

    I would be surprised if you can’t just connect to the llama.cpp server or just set text-generation-webui to do the same with some setting.
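
    For example (a minimal sketch, assuming the server above is running on its default localhost:8080), llama-server also exposes an OpenAI-compatible API that most front ends can be pointed at:

    curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{"messages": [{"role": "user", "content": "Hello"}]}'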

    At worst you can consider using ollama, which is a llama.cpp wrapper.

    But you would probably want to invest the time to understand how to use llama.cpp directly and put a UI in front of it. Sillytavern is a good one for many use cases; OpenWebUI can be another, but - in my experience - it tends to have more half-baked features and the development jumps around a lot.

    As a more general answer: no, the safetensors format doesn’t directly support quantization, as far as I know