Sorry Python but it is what it is.

    • @barsoap@lemm.ee · 9 points · 2 years ago

      cached copies of crates that you downloaded

      Meh, what else is it supposed to do, delete sources all the time? Then people with slow connections will complain.

      Also size-wise that’s actually not even much (though they could take the care to compress it), what actually takes up space with rust is compile artifacts, per workspace. Have you heard of kondo?

            • @Anafabula@discuss.tchncs.de · 2 points · 2 years ago

              You can globally share compile artifacts by setting a global target directory in the global Cargo config.

              In $HOME/.cargo/config.toml:

              [build]
              target-dir = "/path/to/dir"
              

              The only problems I had when I did it were some cargo plugins and some dependencies with build.rs files that expected the target folder in its usual location.
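The same thing can be done for a single shell session (or in CI) with Cargo's CARGO_TARGET_DIR environment variable instead of editing the global config; the path below is just an example:

```shell
# Shared target directory for every cargo invocation in this shell;
# the path is an example, pick any writable location.
export CARGO_TARGET_DIR="$HOME/.cache/cargo-target"
# Subsequent builds in any workspace now write artifacts there, e.g.:
#   cargo build
```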

    • @Pipoca@lemmy.world · 4 points · 2 years ago

      Python virtual environments feel really archaic. It’s by far the worst user experience I’ve had with any kind of modern build system.

      Even a decade ago in Haskell, you only had to type cabal sandbox init once, rather than source virtualenv/bin/activate every time you cd to the project dir.

      I’m not really a python guy, but having to start touching a python project at work was a really unpleasant surprise.
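For reference, the per-shell dance being complained about looks like this with the stdlib venv module (the directory name .venv is just a convention):

```shell
# One-time setup: create the virtual environment.
python3 -m venv .venv
# Every new shell session: activate it before working on the project.
. .venv/bin/activate
# Or skip activation entirely and call the venv's interpreter directly.
.venv/bin/python -m pip list
```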

    • @hatchet@sh.itjust.works · 6 points · 2 years ago

      I actually vastly prefer this behavior. It allows me to jump to (readable) source in library code easily in my editor, as well as experiment with different package versions without having to redownload, and (sort of) work offline too. I guess, I don’t really know what it would do otherwise. I think Rust requires you to have the complete library source code for everything you’re using regardless.

      I suppose it could act like NPM, and keep a separate copy of every library for every single project on my system, but that’s even less efficient. Yes, I think NPM only downloads the “built” files (if the package uses a build system & is properly configured), but it’s still just minified JS source code most of the time.

      • @Espi@lemmy.world · 5 points · 2 years ago

        With python and virtualenv you can also keep the entire source of your libraries in your project.

    • Sören · 36 points · 2 years ago

      npm has a lockfile which makes it infinitely better.

      • bjorney · 20 points · 2 years ago

        pip also has lock files

        pip freeze > requirements.txt

        • Sören · 0 points · 2 years ago

          That’s not a lockfile. This would be the equivalent of package.json

          • bjorney · -2 points · 2 years ago

            How is it not a lock file?

            package.json doesn’t contain the exact version number of all downstream dependencies, this does
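To illustrate with a hypothetical frozen file (package names and versions invented): pip freeze writes exact pins for everything installed in the environment, transitive dependencies included, which is exactly what a lockfile records:

```
# requirements.txt as produced by `pip freeze` (illustrative contents)
flask==2.3.2
jinja2==3.1.2      # pulled in by flask, but pinned here too
werkzeug==2.3.6
```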

            • @gornius@lemmy.world · 0 points · 2 years ago

              Lockfile contains exact state of the npm-managed code, making it reproducible exactly the same every time.

              For example, without a lockfile, your package.json might allow version 5.2.x. In your working directory you use 5.2.1, but meanwhile 5.2.2 has appeared in the registry, still matching your criteria. Now let’s say a new bug appeared in 5.2.2.

              Now you have mismatched vendor code that can make your code behave differently on your machine and your coworker’s machine, sending you hunting for a bug that wasn’t even on your side.

              Lockfile prevents that by saving an actual state of vendor code.
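The scenario above in package.json terms (hypothetical package name): the manifest only records the acceptable range, while package-lock.json records that, say, 5.2.1 was the version actually installed:

```json
{
  "dependencies": {
    "example-lib": "5.2.x"
  }
}
```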

              • bjorney · 1 point · 2 years ago

                Yes, which is EXACTLY like a pip freeze’d requirements.txt, storing the exact version of every package and downstream dependency you have installed

          • bjorney · 8 points · edited · 2 years ago

            Would that just create a list of the current packages/versions

            Yes, and all downstream dependencies

            without actually locking anything?

            What do you mean? Nothing stops someone from manually installing an npm package that differs from package-lock.json; this behaves the same. If you pip install -r requirements.txt, it installs the exact versions specified by the package maintainer, just like npm install. The only difference is that python requires you to specify the “lock file” instead of implicitly reading one from the CWD.
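A minimal sketch of that round trip (run inside the relevant environment):

```shell
# Capture the exact, transitively-resolved versions in this environment.
python3 -m pip freeze > requirements.txt
# The target machine then replays them with:
#   python3 -m pip install -r requirements.txt
```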

            • @SatyrSack@lemmy.one · 3 points · edited · 2 years ago

              As I understand, when you update npm packages, if a package/version is specified in package-lock.json, it will not get updated past that version. But running those pip commands you mentioned is only going to affect what version gets installed initially. From what I can tell, nothing about those commands is stopping pip from eventually updating a package past what you had specified in the requirements.txt that you installed from.

              • bjorney · 2 points · 2 years ago

                But running those pip commands you mentioned is only going to affect what version gets installed initially.

                I don’t follow. If my package-lock.json specifies package X v1.1 nothing stops me from manually telling npm to install package X v1.2, it will just update my package.json and package-lock.json afterwards

                If a requirements.txt specifies X==1.1, pip will install v1.1, not 1.2 or a newer version. If I THEN install package Y that depends on X>1.1, the pip install output will say 1.1 is not compatible and that it is being upgraded to 1.2 to satisfy package Y’s requirements. If package Y works fine on v1.1 and does not require the upgrade, it will leave package X at the version you had previously installed.
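Both forms described above are legal requirements.txt syntax (X and Y standing in for real package names, as in the comment):

```
# exact pin, as pip freeze writes it
X==1.1
# range constraints are also valid requirements.txt syntax
Y>=1.1,<2.0
```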

              • @rgalex@lemmy.world · 3 points · 2 years ago

                The behaviour you mention is from npm install, which will install the exact versions from package-lock.json, if present. If not, it will act as an npm update.

                npm update will always update, and rewrite the package-lock.json file with the latest version available that complies with the restrictions defined on the package.json.

                I may be wrong, but I think the difference may be that python only has the behaviour that package-lock.json offers, not that of package.json, which allows the developer to put constraints on the max/min version allowed to install.

                • Fushuan [he/him] · 2 points · 2 years ago

                  If you want min-max behaviours you need to use wrappers like pipenv or jump into conda/mamba. Pip offers basic functionality because there are more advanced tools that the community uses for the more advanced use cases.

  • @waz@lemmy.world · 6 points · 2 years ago

    Getting into rust is still on my to-do list, otherwise I’ve no major problem with pip or npm. They both have their flaws, but both work well enough to do what I need them for. If I had to prefer one it would be pip simply to sustain my passionate hate for all things JavaScript.

  • spez · 5 points · 2 years ago

    Pip has a good looking loading thingy though.

  • @gerryflap@feddit.nl · 19 points · 2 years ago

    This is why I use poetry for python nowadays. Pip just feels like something ancient next to Cargo, Stack, Julia, npm, etc.

  • Tekhne · 7 points · 2 years ago

    No one here has yet complained about Cocoapods and Carthage? I’m traumatized. Thank God for SwiftPM

    • Pxtl · 2 points · edited · 2 years ago

      What’s wrong with nuget? I have to say I like the “I want latest” / “no, all your dependencies are pinned; if you want to update to latest you’ve got to decide to do it” workflow. I can think of some bad problems when you try to do fancy things with it, but for the basic case of “I just want to fetch my program’s dependencies” it’s fine.

      • Lucky · 2 points · edited · 2 years ago

        I’m guessing they only used it 10 years ago, when it was very rough around the edges. It didn’t integrate well with the old .NET Framework because it conflicted with how web.config managed dependencies, and the VS integration was poor. It was quite bad back then… but so was .NET Framework in general. Then they rebuilt from the ground up with dotnet core and it’s been rock solid since.

        Or they just hate Microsoft; shitting on anything Microsoft does regardless of the actual product is a common motif.

        • Pxtl · 2 points · 2 years ago

          Imho the VS integration has always been good, it’s the web config that’s always been a trash fire, and that’s not new.

          • Lucky · 1 point · 2 years ago

            The project I’m on right now originally had the nuget.exe saved in source control because they had to run it manually through build scripts; it wasn’t built into VS until VS2012.

    • Lucky · 9 points · 2 years ago

      I’ve never had an issue with nuget, at least since dotnet core. My experience has it far ahead of npm and pip

      • @jubilationtcornpone@sh.itjust.works · 8 points · edited · 2 years ago

        I’ll second this. I would argue that .Net Core’s package/dependency management in general is way better than Python or JavaScript. Typically it just works and when it doesn’t it’s not too difficult to fix.

        • @dan@upvote.au · 2 points · 2 years ago

          It’s also much faster to install packages than npm or pip since it uses a local package cache and each package generally only has a few DLL files inside.

          • @JakobDev@feddit.de · 5 points · 2 years ago

            Yes, but this file is created by you and not pip. It’s not like package.json from npm. You don’t even need to create this file.

            • Well, if the file has to be created by hand, that’s very cumbersome.

              But what is sometimes done to create it automatically is using

              pip freeze > requirements.txt

              inside your virtual environment.

              You said I don’t need to create this file? How else will I distribute my environment so that it can be easily used? There are a lot of other standards, like setup.py etc., so it’s only one possibility. But the fact that there are multiple competing standards shows that the way pip handles this is kinda bad.

              • @JakobDev@feddit.de · 2 points · 2 years ago

                If you try to keep your dependencies low, it’s not very cumbersome. I usually do that.

                A setup.py/pyproject.toml can replace requirements.txt, but it is for creating packages and does way more than just installing dependencies, so they are not really competing.

                For scripts which have just 1 or 2 packages as dependencies, it’s also usual to just tell people to run pip install .
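For comparison, a hypothetical minimal pyproject.toml (standard PEP 621 metadata; name and dependency invented) that declares dependencies as part of a package definition, which is the "does way more" case described above:

```toml
[project]
name = "example-app"            # illustrative name
version = "0.1.0"
dependencies = [
    "requests>=2.28",           # illustrative dependency
]

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
```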

              • @Vash63@lemmy.world · 2 points · 2 years ago

                I work with python professionally and would never do that. I add my actual imports to the requirements and if I forget I do it later as the package fails CI/CD tests.

          • Farent · 6 points · 2 years ago

            Isn’t it called a requirements.txt because it’s used to export your project requirements (dependencies), not all packages installed in your local pip environment?

        • @theFibonacciEffect@feddit.de · 1 point · edited · 2 years ago

          If newer versions are released and dependencies change you would still install the old dependencies. And if the dependencies are not stored you can’t reproduce the exact same environment.

    • @ExLisper@linux.communityOP · 0 points · 2 years ago

      cargo just works, it’s great and everyone loves it.

      npm has a lot of issues but in general does the job. When docs say do ‘npm install X’ you do it and it works.

      pip is a mess. In my experience doing ‘pip install X’ will maybe install something but it will not work because some dependencies will be screwed up. Using it to distribute software is pointless.

      • @krimson@feddit.nl · 21 points · 2 years ago

        I use pip extensively and have zero issues.

        npm pulls in a million dependencies for even the simplest functionality.

        • qaz · 4 points · 2 years ago

          You’ve never had broken dependencies?

          • @krimson@feddit.nl · 5 points · 2 years ago

            Nope. I know mixing pip with python packages installed through your systems package manager can be a problem but that’s why I containerize everything.

            • qaz · 1 point · 2 years ago

              I separate everything in virtual environments myself, but in my opinion you shouldn’t need to do that simply to avoid breaking your system.

        • @ExLisper@linux.communityOP · -1 points · 2 years ago

          It probably works for your own local project. After using it for a couple of days to install some 3rd party tool, my conclusion is that it has no idea about dependencies. It just downloads some dependencies in some random versions and then it never works. Completely useless.

    • TunaCowboy · 37 points · 2 years ago

      This is programmer humor, 95% of the people here still get defeated by semicolons, have never used a debugger, and struggle to exit vim.

      • Fushuan [he/him] · 14 points · 2 years ago

        Sometimes I wish there was a community for more advanced users, where the concept of deciding on the best build toolchain per project is not a major hurdle. Venvs? Nbd. Pipenv? Nbd. Conda/mamba/micromamba? Nbd. Pure pip? Oh boy, I hope it’s a simple one, but I’ll manage. Maven? Fml, but sure. Npm? Sure. “Complex” git workflows? No problem.

        Idk, that’s just setting up the work environment; if your brains get squeezed by that, I’m not sure you will then be able to actually code whatever is being asked of you. Some people…

        But yeah, this is a newbie space so I guess that we have to ignore some noise.

        • jelloeater · 1 point · 2 years ago

          Seriously, I usually use Poetry these days for most projects. Shit just works, builds well, and lets me distribute my code on PyPI just fine. Everything in one pyproject.toml.

    • @ExLisper@linux.communityOP · 15 points · 2 years ago

      This article someone linked is not 14 years old and it perfectly describes the mess python and pip are: https://chriswarrick.com/blog/2023/01/15/how-to-improve-python-packaging/

      My favorite part is:

      Most importantly: which tool should a beginner use? The PyPA has a few guides and tutorials, one is using pip + venv, another is using pipenv (why would you still do that?), and another tutorial that lets you pick between Hatchling (hatch’s build backend), setuptools, Flit, and PDM, without explaining the differences between them

      But yes, following old blog post is the issue.

      • @jjjalljs@ttrpg.network · 3 points · 2 years ago

        If you’re using a manually managed venv, you need to remember to activate it, or to use the appropriate Python.

        That really doesn’t seem like a big ask.

        I’ve been using python professionally for like 10 years and package management hasn’t really been a big problem.

        If you’re doing professional work, you should probably be using docker or something anyway. Working on the host machine is just asking for “it works on my machine what do you mean it doesn’t work in production?” issues.
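A sketch of that docker approach (the file names and entrypoint are assumptions, not from the thread); dependencies are installed from the pinned requirements file before the code is copied in, so the dependency layer caches across code changes:

```dockerfile
FROM python:3.12-slim
WORKDIR /app
# Install pinned dependencies first so this layer is cached.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Then copy the application code.
COPY . .
CMD ["python", "main.py"]
```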

        • @ExLisper@linux.communityOP · -2 points · 2 years ago

          No, actually most devs don’t use docker like that. Not java devs, not JS devs, not rust devs. That is because maven, npm and cargo manage dependencies per project. You use it for python exactly because pip does it the wrong way and python has big compatibility issues.

            • @NBJack@reddthat.com · 4 points · 2 years ago

              Friend, while I appreciate the time and effort on the docs, it has a rather tiny section on one of the truly worst aspects of pip (and the only one that really guts usability): package conflicts.

              Due to the nature of Python as an interpreted language, there is little that you can check in advance via automation around “can package A and package B coexist peacefully with the lowest common denominator of package X”? Will it work? Will it fail? Run your tool/code and hope for the best!

              Pip is a nightmare with larger, sprawling package solutions (i.e. a lot of the ML work out there). But even with the freshest of venv creations, things still go remarkably wrong rather quickly in my experience. My favorite is when someone, somewhere in the dependency tree forgets to lock their version, which ends up blossoming into a ticking time bomb before it abruptly stops working.

              Hopefully, your experiences have been far more pleasant than mine.

    • @barsoap@lemm.ee · 1 point · edited · 2 years ago

      The only time I ever interacted with python packaging was when packaging for nixos. And I can tell you that the whole ecosystem is nuts. You have like ten package managers each with thirty different ways to do things, none of which specify dependencies in a way that can be resolved without manual input because y’all have such glorious ideas as implementing the same interface in different packages and giving each the same name and such. Oh and don’t get me started on setup.py making http requests.

      • Scribbd · 3 points · 2 years ago

        If we talk about solutions: python has plenty, which might be overwhelming to the user.

        I use Direnv to manage my python projects. I just have to add layout pyenv 3.12.0 on top and it will create the virtual environment for me. And it will set my shell up to use that virtual environment as I enter that directory. And reset back to default when I leave the directory.

        But you could use pipenv, poetry, pdm, conda, mamba for your environment management. Pip and python do not care.
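The direnv setup described above amounts to a one-line .envrc at the project root (version number as in the comment; `layout pyenv` requires pyenv to be installed alongside direnv):

```
# .envrc -- direnv runs this on cd into the directory
layout pyenv 3.12.0
```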

    • I have to agree. I maintain and develop packages in Fortran/C/C++ that use Python as a user interface, and in my experience pip just works.

      You only need to throw together a ≈30 line setup.py and a 5 line bash script and then you never have to think about it again.
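A hypothetical skeleton of such a setup.py (all names invented; the real one would add an ext_modules entry for the Fortran/C/C++ sources), just to show how little glue `pip install .` needs:

```python
# setup.py -- minimal packaging glue so `pip install .` works
from setuptools import setup, find_packages

setup(
    name="example-sim",            # invented project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["numpy"],    # illustrative runtime dependency
    python_requires=">=3.8",
)
```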

    • @Sprout4426@lemmy.world · 1 point · 2 years ago

      I really dislike pnpm. If everything you do is install and build, then it doesn’t matter what you use; if you do anything complex, pnpm will come back to bite you. Yarn is a good middle ground.

      • Andrew · 1 point · 2 years ago

        You literally didn’t give any arguments for why you really dislike pnpm. The most obvious benefit is several times faster installation. It has also resolved some peer dependency issues (I don’t remember the details).

    • @olutukko@lemmy.world · 3 points · 2 years ago

      What’s the difference? I’m currently doing my web development 2 course, where we started using react, so I’m typing npm into the terminal all the time :D