
If you are a Linux user and own a nice camera, you can use gphoto2 and ffmpeg to create a virtual camera. I posted a howto on HN a couple of days ago[0][1]; here it is for anyone who might need it. I tried it with both a Sony RX100VA and a Sony A7III, and in both cases it works really well.

edit: forgot to mention that this works over USB, so you don't have to pay a crazy markup for a capture card

edit2: (because I'm so excited about getting this to work) here is the list of supported cameras[2] - sadly I was not able to get my GoPro Hero 6 to work.

[0] https://www.crackedthecode.co/how-to-use-your-dslr-as-a-webc...

[1] https://news.ycombinator.com/item?id=23325143

[2] http://www.gphoto.org/proj/libgphoto2/support.php
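For anyone who wants the rough shape without clicking through: the setup boils down to loading a v4l2loopback device and piping the camera's live view into it. This is a sketch, not the exact commands from the post — the device node (/dev/video2 here) and card label are assumptions; check what v4l2loopback actually created on your machine:

```shell
# Create a virtual video device (exclusive_caps=1 helps browsers detect it)
sudo modprobe v4l2loopback exclusive_caps=1 card_label="DSLR Webcam"

# Pipe the camera's live view into the loopback device
gphoto2 --stdout --capture-movie |
  ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video2
```

Any app that can read a v4l2 device (Zoom, Meet, OBS) should then see the camera as a regular webcam.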



Hey! I was trying to do this a while back, and couldn't make it work.

I followed the instructions in your post, and (although it didn't work right away) it gave me the will to make it work ;)

A little advice: I fixed my setup by finding the correct v4l2 device, because video0 was already assigned. If you run:

v4l2-ctl --list-devices

it will tell you which v4l2 devices exist on your machine, so you can enter the correct command (that was the only missing piece of my puzzle). If you already have a webcam in your computer, /dev/video0 will already be assigned, and the gphoto | ffmpeg pipe gives overly cryptic messages (it complains about the formats not being correct, when it should complain about the target not being a v4l2 device).
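To illustrate what to look for (the device names and bus paths below are made up; your listing will differ), the output groups device nodes under each driver, and you'd point your pipeline at the loopback driver's node rather than the built-in webcam's:

```shell
$ v4l2-ctl --list-devices
Integrated Camera (usb-0000:00:14.0-8):
	/dev/video0
	/dev/video1

Dummy video device (0x0000) (platform:v4l2loopback-000):
	/dev/video2
```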


Shame, since there is a perfectly good standard for cameras: the USB Video Class. Why have all these cameras decided to use a different set of protocols that don't work out of the box in any OS?


I'm guessing this is because proper live view video from a DSLR requires much higher bandwidth than USB can provide. The protocols used for remote capture and download over USB should be more standardized, for sure, but live view seems really hard.


It's actually much more because the DSLR companies, for the most part, are technologically backwards and don't get things like platforms, APIs, or similar. It's nineties-style closed thinking. It's a big part of why cell phones are now eating their lunch. I used to be a pretty serious photographer, and own probably $10,000 worth of camera equipment.

I mostly shoot with my cell phone these days, not because I mind spending money on cameras, but because it's a better device for most photography. It integrates with the world. Cameras integrate with their manufacturer's closed ecosystems.


USB Video class allows the device to provide a list of formats supported, and the host to choose one. That seems suitable for the host to manage bandwidth across its USB links, even if other devices are also using bandwidth.


I got a GoPro Hero 8 Black into OBS (Linux, but should work anywhere) by connecting to it using WiFi and its semi-documented api (https://github.com/KonradIT/gopro-py-api) to turn on its UDP video stream, then used that as the media source in OBS. I'm betting your Hero6 will work this way too.
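For anyone trying the same thing: after using the linked library to switch the stream on (see the repo's examples), you can preview it with ffplay before wiring it into OBS. The UDP address below is the endpoint commonly reported for GoPro WiFi streaming; treat it as an assumption to verify for your model:

```shell
# Preview the GoPro's UDP stream once it has been started via the API;
# nobuffer reduces latency for a live preview
ffplay -fflags nobuffer -i "udp://10.5.5.9:8554"
```

In OBS, the same URL goes into a media source with "local file" unchecked.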


Is this possible on macOS at all? I have an RX100 V and an Elgato Cam Link HD, but would love to use that capture card with another cam and use the RX100 over USB simultaneously.


It works on Mac:

  brew install gphoto2
  brew install ffmpeg --with-ffplay
  gphoto2 --abilities
  # Abilities for camera             : Sony Alpha-A6300 (Control)
  # ...
  gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0  -f matroska - | ffplay -
I'm piping it to ffplay, so this will at least let you test your camera, or you could also use it in OBS as a window source. Also, make sure your camera's USB mode is not set to "Mass Storage" but to "Remote Camera Control".


Thanks for the tip, really appreciate the actual commands. I'm wondering if anyone else is running into this:

    $ brew install ffmpeg --with-ffplay
    Usage: brew install [options] formula

    # Install flags here, nothing about --with.
    Error: invalid option: --with-ffplay


You're right, they removed that option: https://formulae.brew.sh/formula/ffmpeg. I guess ffplay is built in by default now.


can you try piping that to gstreamer?


Sure. I'm not sure what gstreamer plugin/sink would create a loopback device, but this plays as well.

   gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0  -f matroska - | gst-launch-1.0 fdsrc fd=0 ! decodebin !  videoconvert ! videoscale ! autovideosink


per this - https://apple.stackexchange.com/a/356362

it should be,

  osxvideosink

I wonder if this works,

  gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0  -f matroska - | gst-launch-1.0 fdsrc fd=0 ! decodebin !  videoconvert ! videoscale ! osxvideosink


Looks like both gphoto2 and ffmpeg are available on Homebrew, worth giving it a shot for sure. FWIW, I did end up building my own ffmpeg because the Debian default didn't have NVIDIA support. Want me to give it a try?


I'd love it if you gave it a try! I've already spent enough hours re-compiling ffmpeg to get nvenc support :)


Spoke too soon: v4l2loopback-utils is the missing piece on macOS. Found this with a basic search[0], if anyone enterprising enough wants to take a crack at it:

https://apple.stackexchange.com/questions/353168/how-can-i-c...


Yes please! Would love to use my X-T30 as a webcam on my mac.


Cascable Pro Webcam claims to support the RX100 V:

https://cascable.se/pro-webcam/

Compatibility table:

https://cascable.se/help/compatibility/


Here's a simple GUI app[0] that creates a Syphon stream, and you can use that from an app called CamTwist, or possibly from OBS as well.

[0] https://github.com/v002/v002-Camera-Live


After going down this rabbit hole myself, I can't recommend this approach.

First of all, it works! And it's pretty cool to not need a capture card. But, for most cameras, you only receive at the resolution of the on-camera screen.

In other words, the video stream gphoto2 receives is intended for a camera remote preview screen. Check your camera's resolution before investing in this as a solution -- my very expensive 4k @ 60fps-capable mirrorless camera only produces a pretty poor 640x480 @ 50fps stream using gphoto2.
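One hedged way to check what you'd actually get before spending money (ffprobe reads a few seconds of the piped stream and reports its properties; the exact output lines vary by camera):

```shell
# Inspect the live-view stream gphoto2 actually delivers;
# the "Video:" line shows the codec and resolution
gphoto2 --stdout --capture-movie | ffprobe -i - 2>&1 | grep Video
```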

Additional video recording features like flicker reduction or IS seem to be lacking through this method as well.

In the meantime, I'm patiently awaiting the delivery of my 4k capture card :)


That's a dope trick. I only wish there was a hack as easy as this to turn my iPhone into my Mac Mini's camera/microphone.

I've been looking for something that does this for a while now, WFH on the Mac Mini without a camera or mic is just dreadful :(


OBS does let you use an iPhone as a camera. (Not sure about mic though...)


I'm not sure which of the solutions you're referring to, but after Googling I found these three solutions:

- https://obs.camera/ (which seems to be the most promising)

- https://www.newtek.com/software/ndi-camera/ (mentioned in a forum)

- and EpocCam https://www.kinoni.com/ (from the answer below)

I'll leave these here for people who stumble upon this answer, but if you were talking about another alternative I'd love to know about it. For now I'll give OBS.Camera a go and see if I like it. Thanks!


From memory, the solution I used was “plug iPhone into USB port, select iPhone as a camera source in OBS”

Pretty sure it “just works”.


Look at EpocCam. Works for me.


Oh no... I just got a capture card to use with my A7 III :(

Will give this a shot, luckily I'm still in the return window.

E: Works great! Make sure to enable "PC Control" in settings for other Sony cameras. I had it set to USB Mass Storage (which is maybe the default?)


What card did you get?


Razer Ripsaw HD - it's apparently equivalent to an Elgato HD60S, but most importantly it's actually in stock :)


Before you return your card, keep in mind that gphoto2 will only receive at the resolution of your camera's display. For example, my camera can shoot at 4k, but gphoto2 only provides 640x480p.

Additionally, many camera features (flicker-reduction, electronic shutter options) are unavailable through this method.

This is really equivalent to a camera remote preview video, not intended to be used for actual video capture.


It is also possible to use the OBS output as a virtual webcam. All you need is v4l2loopback and the v4l2sink OBS plugin: https://github.com/CatxFish/obs-v4l2sink

It works perfectly, and the virtual camera can be used with Jitsi, BigBlueButton, and the like :).
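For completeness, the usual setup steps (the device number and label are assumptions; pick any free number):

```shell
# Create a dedicated loopback device for OBS;
# exclusive_caps=1 is needed for Chrome to list the device
sudo modprobe v4l2loopback video_nr=10 card_label="OBS Virtual Cam" exclusive_caps=1

# Then in OBS: Tools -> v4l2sink, set the device path to /dev/video10, and start it.
```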


Hah, I just followed your guide after stumbling on it on Google yesterday to set up a Canon M50 on Linux Mint! It works incredibly well — head and shoulders above the video quality from a webcam, and now that I'm piping video through ffmpeg there's tons of potential to do some weird stuff with filters and swapping to pre-recorded video.


I wish I could take credit for this; Ben Chapman did all the work, I just happened to google it :)

And yeah, 100% agree on video quality, it is so much better than what you get from that potato sensor on MBP


My Sony a6300 is supported, but I can't get it to work. Doesn't even show up on lsusb when I connect in PC remote mode, much less in gphoto2 --auto-detect. I'm stumped. Too bad, that would have been useful.


The USB mode has to be changed. Set it to “PC Remote” mode. https://helpguide.sony.net/gbmig/44840601/v1/eng/contents/TP...


Thanks. That's what I did. When set to PC Remote mode, the camera does not show up in lsusb or gphoto2. The other modes work, but don't support capture.


On the off chance that somebody stumbles on this: I got it to work in the end. After the initial experiments, I noticed the camera refused to charge from a bog-standard USB charger (with any cable). Removing and reinserting the battery restored charging functionality and, as it turns out, the camera also started being visible in PC remote mode. Serves me right for only using the regular on/off switch.


You should try another microUSB cable - my first microUSB cable did not work because it was charging-only.


Thanks. I tried several, but I suppose they could all be bad. Mass storage mode works, though.


Yet another way that my Panasonic GX-1 can't capture video :(. It has a mini HDMI port, so I thought I could do it there, but it doesn't do live-view over HDMI, just playback.


Do you have any idea if this works with the older GoPro models?

Plan to give it a go later tonight but wondering if anyone has any success stories.


This is what I found while I was trying to get a GoPro to work with gphoto2[0] - would love to know if you do get it to work somehow.

[0] https://sourceforge.net/p/gphoto/mailman/message/36174298/



That table is why I posted the question.

It’s unclear if that’s the list of supported cameras or if it’s a list of cameras and only those with entries in the next two columns are supported.

Either way I can’t wait to get home and mess with it.


I have an old Canon PowerShot ELPH 300 HS, and only newer models are on that support list. I'll have to try this anyway.


Where does audio come from? From DSLR? If you need a separate mic, is there a/v sync issue?


I use a USB mic, and so far neither I nor anyone I've been on Google Meet with has noticed any issues.



