Nice work.

Are you using OpenCV for its video decoding, or are you doing any machine learning stuff? I've been wondering if there is a machine learning solution to finding the original dithering algorithm that's been applied.

  • JTL replied to this.

    Seagull Are you using OpenCV for its video decoding, or are you doing any machine learning stuff? I've been wondering if there is a machine learning solution to finding the original dithering algorithm that's been applied.

    Just for video decoding and static computation based on the frame pixel values, but I'm all ears 🙂
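The "static computation based on the frame pixel values" could, in principle, look something like this minimal sketch (pure NumPy; the function names and the red-highlight choice are my own illustration, not necessarily what the tool does):

```python
import numpy as np

def changed_mask(prev, frame):
    """Boolean mask of pixels that differ in any channel between two frames."""
    return np.any(frame != prev, axis=2)

def colorize(mask):
    """Paint changed pixels red (BGR) on black, so dithering flicker stands out."""
    vis = np.zeros(mask.shape + (3,), dtype=np.uint8)
    vis[mask] = (0, 0, 255)
    return vis
```

In practice the frames would come from `cv2.VideoCapture(...).read()`, with the colorized output displayed or written back to a file.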

      Just to understand how this works, in a nutshell: it takes an already-existing video file, plays it back, and makes changes visible by coloring them?
      So it could be used with any capture card and existing video files?

      The cheaper cards don't capture raw but use downsampling. Would temporal dithering still be visible? It would be great if there were a really cheap card anyone could afford.

      • JTL replied to this.

        KM Just to understand how this works, in a nut shell: it takes an already-existing video file, plays it back and makes changes visible by coloring them?

        It can also take a realtime input if the card cooperates, but yes.

        KM The cheaper cards don't capture raw but use downsampling. Would temporal dithering still be visible? Would be great if there's a real cheap card anyone could afford.

        Unsure. I got the best card I could to make that less of an issue.

        JTL

        It's only idle wondering; I have no idea if it's feasible or how to do it. Even if it's not as conclusive as reverse engineering the algorithm, it might be useful to quantify the degree of randomness. As I have pontificated before, perhaps a more random dithering pattern is less likely to create aggravating visual patterns. After all, my personal experience has been that GPU dithering is generally bad while monitor dithering is generally OK. It would also be helpful to be able to determine whether a GPU is capable of multiple dithering algorithms, and when each is used.

        Idle thoughts aside, I would like to learn OpenCV as it might be useful for my new job. This could be an interesting place to start.
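One crude way to quantify the degree of randomness, as floated above, would be to look at how evenly per-pixel flip rates spread out across the frame. This is purely a hypothetical sketch under my own assumptions, not anything either tool actually implements:

```python
import numpy as np

def flip_rate_entropy(masks, bins=16):
    """Shannon entropy (bits) of the histogram of per-pixel flip rates.

    `masks` is a (num_frame_pairs, height, width) boolean stack marking
    which pixels changed between consecutive frames. An ordered dither
    that always flips the same pixels concentrates the rates into a few
    histogram bins (low entropy); a more random dither spreads them out.
    """
    rates = np.mean(masks, axis=0).ravel()
    hist, _ = np.histogram(rates, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())
```

Comparing this number across GPUs or monitor settings might at least rank patterns by randomness, even without identifying the underlying algorithm.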

        Anyway, back on topic. Looking forward to seeing how your findings compare with mine. I didn't use OpenCV to decode; instead I used VLC to screenshot each frame, then decoded the resulting .png files with a random library I found. It'll be interesting to see if that makes a difference. You are of course welcome to my capture samples, though I'm not sure how to share them. They total 20GB now, and the internet here isn't amazing (0.4MB/s upload only).

        • JTL replied to this.

          Seagull You are of course welcome to my capture samples, though I'm not sure how to share them. They total 20GB now, and the internet here isn't amazing (0.4MB/s upload only).

          If you have a computer that can be left on to host them, I have some ideas. Feel free to email me (jtl at teamclassified dot ca).

            JTL Unfortunately I don't have a PC I can leave on. I might have a 1TB OneDrive account that I can use to share. Will get back to you.

              Seagull I might have a 1tb onedrive account that I can use to share

              You'd still need to upload them somehow, so I don't see how that solves the problem.

              Let's discuss this soon

              This looks great. What does a 'good' output look like using this s/w? Much less (or no) noise present?

              • JTL replied to this.

                diop This looks great. What does a 'good' output look like using this s/w? Much less (or no) noise present?

                Still early in the testing, but that's right.

                Seagull Some kind of split utility could be used to break the files into manageable chunks. (120MB should take about 5 minutes to upload on a 0.4MB/s connection.)
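For what it's worth, the standard `split` tool can do exactly that (the filenames here are made up for illustration):

```shell
# Break a big archive into 120 MB pieces for uploading in stages...
split -b 120M samples.tar samples.part_
# ...then reassemble them on the receiving end.
cat samples.part_* > samples.tar
```

`split` names the pieces with a predictable suffix order, so a plain `cat` of the glob restores the original file byte-for-byte.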

                  Slacor

                  All on OneDrive now; anyone who wants them can give me their email. @JTL you should have an invite email?

                    Seagull Haha. I just saw the email without context this morning and thought it was a malicious email of some kind.

                    Thanks for the reassurance.

                    JTL Ha! Not at all surprising. I pay for Google Drive despite getting a huge 1TB of OneDrive for free. My biggest problem with it is that it often doesn't detect small changes to files, resulting in them not being synced, so it's no good for code. I can share via Google Drive if needed, but you'll need to be able to download the files quickly, as I don't have a huge amount of space to play with.

                    • JTL replied to this.

                      Seagull What's the total size?

                      I'll poke at it more later, but I'll let you know if I need the Google Drive download.

                      so it's no good for code

                      My recommendation there is to look into Git and host a private server if needed.

                        JTL Of my Google Drive? I've filled 77GB out of 100GB, and the samples total 20GB. So I have just enough space, but it won't be long before I need to remove them.

                          I have unlimited Google Drive. If it's uploaded to Google Drive and available, I could re-share it indefinitely.

                          Did you find anything unusual yet, for example with DirectWrite font rendering, Firefox Quantum, "Linux eye strain", or by plugging in a known-bad graphics card in BIOS mode?

                          Slacor I have my own servers that could be used at some point.
