There is an Nvidia RTX 5000? How much does it cost?

    smilem yeah it's a Quadro RTX 5000. There's also an RTX 4000. I paid 300 for my 5000 but they're usually 500ish I think.

    Still haven't tested it yet, going to test soon though. The monitor I was planning to use (Dell UP2715K) ended up having high modulation, so I didn't end up trying it. The modulation is lowest at 100% brightness, but that's too bright for me. Gonna test a few different monitors.

    degen what exact Windows build are you running with the A770 Arc? Is it completely safe even outside of gaming? I have this GPU, I just don't know which Windows to load for testing it.

    How many watts do these use at idle? The Quadro 6000 (the non-RTX one) uses a lot and runs hot even at idle, 65°C or so.

    The Quadro card drivers do not work with Nvidia's gaming cards. How is that even possible? The devs are idiots perhaps? Hardly, so are they paid to do this? Any hacks to install a Quadro and an RTX 3090 etc. together?

    Then there is the incompatibility between Nvidia driver versions: when the cards are from different eras, like an RTX 3090 and a Quadro 6000 or Quadro 2000, one driver overwrites the other. Any workarounds?
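
    If anyone wants numbers on idle draw, it's easy to poll with nvidia-smi; a quick sketch (newer cards report power.draw, very old ones like Fermi may just show N/A):

    ```python
    import subprocess, time

    # poll name, power draw and temperature a few times to see where the card settles
    for _ in range(5):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,power.draw,temperature.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)
        time.sleep(2)
    ```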

      smilem

      I don't have a Quadro 6000. The only cards I have at the moment would be the Quadro RTX 5000/4000, A770 LE, 2080 Super, 1660 Super, GTX 680, and Quadro A2000.

      I haven't tried my PC yet; I'm really waiting to get my Eazeye monitor before I start testing any more stuff. My eyes are so flared up right now. I can report back on idle wattage once I have a comfortable setup at least, I don't have a safe baseline at the moment.

      Quadro is the card model, not the driver. Maybe try the RTX 3090 Studio driver?

      There is a way to have a 3090 and another, safe GPU installed together, for example with the 3090 doing all of the legwork while the safe GPU is only used for display output. I haven't tried it, but I think it works for others?
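
      I haven't done this myself, but if it's the usual per-app GPU preference thing in Windows (monitor plugged into the safe card, apps told to render on the 3090 via Settings > Display > Graphics), then the setting that gets written is just a registry value. A rough sketch of the same thing done directly, with a made-up exe path:

      ```python
      import winreg

      app_exe = r"C:\Games\SomeGame\game.exe"   # placeholder path, pick the app you care about

      key = winreg.CreateKey(
          winreg.HKEY_CURRENT_USER,
          r"Software\Microsoft\DirectX\UserGpuPreferences",
      )
      # GpuPreference=2 = "high performance" GPU, 1 = "power saving", 0 = let Windows decide
      winreg.SetValueEx(key, app_exe, 0, winreg.REG_SZ, "GpuPreference=2;")
      winreg.CloseKey(key)
      ```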

      smilem oh wait, I re-read what you sent. I see what you're talking about, hmm, I'm not sure how they do it tbh. I can probably look into it though, because that's what I would do once I confirm a safe display output card for myself.

      smilem Let's say you have a Quadro and a GeForce together in the same system. They will use the same Nvidia driver. You need to mod the newer GeForce driver so that it recognises the Quadro (basically add its hardware ID to an .inf file). This can be done easily with a program called NVCleanstall. The same is possible with a Quadro driver.

      Edit: If they are from the same generation, they will both work on a GeForce driver without modding.
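
      For anyone curious what the mod actually is: it's basically two lines added to the driver's nv_dispi.inf, one in the device list and one in [Strings]. A rough sketch of doing it by hand (the device ID here is the Quadro RTX 5000's, the section name is just an example copied from an existing entry, and a modded .inf breaks the driver signature, which is part of why NVCleanstall is the easier route):

      ```python
      from pathlib import Path

      inf = Path("nv_dispi.inf")           # display INF inside the extracted GeForce package
      dev_id = "1EB0"                      # PCI device ID of the card to add (Quadro RTX 5000)
      dev_name = "NVIDIA Quadro RTX 5000"
      section = "Section001"               # install section, copy from a similar existing entry

      lines = inf.read_text(errors="ignore").splitlines()
      out = []
      for line in lines:
          out.append(line)
          # add the device entry under the device-list header for your Windows version
          if line.strip().startswith("[NVIDIA_Devices.NTamd64"):
              out.append(f"%NVIDIA_DEV.{dev_id}% = {section}, PCI\\VEN_10DE&DEV_{dev_id}")
          # and the matching display name in the [Strings] section
          if line.strip() == "[Strings]":
              out.append(f'NVIDIA_DEV.{dev_id} = "{dev_name}"')

      inf.write_text("\n".join(out) + "\n")
      ```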

      The NVIDIA Quadro 6000 is a Fermi GPU that will not work with an RTX 3090 etc.; even the 1660 Super is not compatible.

      You can use an old driver for the Quadro 6000, but it is not compatible with any up-to-date driver for the RTX cards or the 1660 Super.

      Anything like the Quadro 6000 is very power inefficient; it will heat up and use more at idle than a 3090, 70 W at least. I would rather consider the NVIDIA Quadro 2000, but it's old too. Or perhaps I just don't know the secret to making it work?

        smilem I didn't test it myself, but I believe you need a Kepler Quadro at minimum to work with the latest modded drivers. A quick glance at the XtremeG modding subreddit shows the 700 series as the minimum.

        a month later

        karut Not yet. I've been flared up by other things, so I haven't been able to start testing my PC stuff yet unfortunately.

        5 months later

        Any more love for the Intel Arc? Radeon Pro? Quadro? Surely one of these is a good direction. My W5700 and K4000 seem promising, but Windows 11 insists on managing them and usually ends up crashing.

        Any chance these Intel Arc graphics cards are included in Intel's Programmer's Reference Manuals (https://www.intel.com/content/www/us/en/docs/graphics-for-linux/developer-reference/1-0/overview.html)?

        Because if they are, we could probably check whether temporal dithering is enabled on Linux, which could explain the Linux eyestrain @degen mentioned. Instructions on how to do it can be found in this thread: https://ledstrain.org/d/1017-registers-to-control-intel-graphics-dithering-on-linux-located

        Edit: I think it's this one: https://www.intel.com/content/www/us/en/docs/graphics-for-linux/developer-reference/1-0/alchemist-arctic-sound-m.html
        Specifically: "Volume 2c: Command Reference: Registers Part 2", page 1056

        Please share your findings. I don't have an Intel Arc card, so I can't participate further.
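
        If someone with an Arc card wants to try it, the gist (going by that thread) is to read the display pipe registers with intel_reg from intel-gpu-tools and check the dithering bits. Something like the sketch below, but the offset and bit are placeholders; the real ones have to be looked up in the Registers volume linked above:

        ```python
        import subprocess

        REG = "0x70030"        # placeholder pipe register offset, take the real one from the PRM
        DITHER_BIT = 1 << 4    # placeholder dithering-enable bit, also from the PRM

        def read_reg(offset: str) -> int:
            # intel_reg prints e.g. "(0x00070030): 0x80000000"; parse the last hex token
            out = subprocess.run(["sudo", "intel_reg", "read", offset],
                                 capture_output=True, text=True, check=True).stdout
            return int(out.strip().split()[-1], 16)

        val = read_reg(REG)
        print(f"{REG} = {val:#010x}, dithering bit set: {bool(val & DITHER_BIT)}")
        ```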

        2 months later

        I was looking to purchase an Intel Arc A750 Limited Edition and test it with Windows 10, 11, and Linux. I have a PC that feels fine on Windows 10 Pro with an Nvidia 3080 Ti. I have tried Windows 11 and many different versions of Linux on this same PC, and all of them give me eyestrain, even though Windows 10 22H2 does not. I'm not entirely certain that dithering is my issue, but I would like to give the A750 a try in a second PC that I built for the purpose of finding a combination of safe components for using Linux/Windows 11 in the future. On this forum, it seems that the A770 LE is the one people are having success with, but I wonder if the A750 LE would be almost identical in terms of the dithering stuff. Does anyone know?

        In terms of laptops, I saw there are some that have Intel Arc as the iGPU. Could Ditherig work on that, like it does for the Intel Iris Xe ones?
