Automated time-lapse? (SOLVED & WORKING!!)
Posted: Wed Jun 19, 2024 7:30 am
I want to create time-lapse videos automatically. I have been using the manual export option on a per-clip basis, but I have a new project that requires an entire day's worth of clips to be exported as a time-lapse every day, which would be very time consuming to do by hand.
Searching turned up some similar ideas, but nothing exactly like what I'm trying to do. It seems like it might be possible like this:
- Define Storage Aux 2 as the second/final target with a clip age limit of 7 days and a Delete action.
- Define Storage Aux 1 as the initial recording target with a clip age limit of 2 hours and a Move to folder (Aux 2) action, with "Convert/Export Queue on move" selected; configure the re-encode with the Time-lapse multiplier of choice and the Replace action for "Where to save."
Then I would set the camera I want to do this for to use that Aux 1 storage, and it should continuously create time-lapse exports for clips as they become 2 hours old. Does that sound right?
If so, great, because that's what I just set up as a test and we'll see what things look like in the morning. If there is a better/easier way though, I'd be glad to hear it!
Next I'll need to implement a script that uses FFMPEG to combine the clips for each day (a rough sketch is below). I wondered if I could just set BI to create one clip per day, but having it create a single 24-hour file and then exporting/converting that whole clip to a time-lapse in one shot seems risky.
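In case it's useful to anyone doing the same thing, here's a rough sketch of the kind of combine script I have in mind: Python driving ffmpeg's concat demuxer. The folder paths and the filename pattern are just placeholders for whatever your export settings actually produce.
[code]
# Sketch only: stitch one day's time-lapse exports into a single daily video
# using ffmpeg's concat demuxer. EXPORT_DIR, OUTPUT_DIR and the filename
# pattern are assumptions, not anything Blue Iris produces by default.
import subprocess
from datetime import date
from pathlib import Path

EXPORT_DIR = Path(r"D:\BlueIris\Aux2")    # assumed: where the exports land
OUTPUT_DIR = Path(r"D:\BlueIris\Daily")   # assumed: where daily videos go

def build_daily_video(day: date) -> Path:
    # Assumes the exported clips for a day sort chronologically by name,
    # e.g. Camera1.20240619_060000.mp4
    stamp = day.strftime("%Y%m%d")
    clips = sorted(EXPORT_DIR.glob(f"*{stamp}*.mp4"))
    if not clips:
        raise FileNotFoundError(f"No exports found for {stamp}")

    # The concat demuxer reads a text file listing the inputs in order.
    list_file = OUTPUT_DIR / f"{stamp}.txt"
    list_file.write_text("".join(f"file '{c.as_posix()}'\n" for c in clips))

    output = OUTPUT_DIR / f"timelapse_{stamp}.mp4"
    # "-c copy" just remuxes with no re-encode; that should be fine because
    # every clip comes from the same BI export settings (same codec/res/fps).
    subprocess.run(
        ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
         "-i", str(list_file), "-c", "copy", str(output)],
        check=True,
    )
    return output

if __name__ == "__main__":
    print(build_daily_video(date.today()))
[/code]
Something like that run from Task Scheduler once a day, after the last 2-hour export has been processed, should leave a single file ready for the upload step.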
For anyone interested in the rest of the process, after getting a single clip for each day, I'll use SyncBack Pro to upload the video to a specific folder in my Google Drive. Then I have Zapier automations set up to watch that folder and publish any video that shows up there to my YouTube channel. I tried to use the YT API to directly publish myself, but they've changed their API auth and it does not appear to be possible to use it "headless" at this point without being a large company with special arrangements in place (like Zapier evidently has).
I previously had the camera export a JPG image once per minute and upload it to a webserver, where I would combine the images with FFMPEG and then continue with the rest of the process described above. That has been working, but the camera would sometimes glitch and stop creating the export JPGs, and even when it was running well the resulting video, while good, was not as beautiful as the exports from BI, given the 20 fps source material I have in BI versus 1 frame per minute with the JPG export method. I have some remote time-lapse cameras that will still have to use the 1-frame-per-minute method (sketched below) because I don't have BI available to handle them, but at least the cameras I have set up with BI can take advantage of the extra frames.
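For anyone curious, the JPG method boils down to an ffmpeg image-sequence call, roughly like this (the paths, glob pattern, and frame rate are just illustrative):
[code]
# Sketch of the JPG-to-video step used for the remote (non-BI) cameras:
# feed the once-per-minute snapshots to ffmpeg as an image sequence.
import subprocess
from pathlib import Path

JPG_DIR = Path("/var/www/timelapse/jpgs")    # assumed upload location
OUTPUT = Path("/var/www/timelapse/day.mp4")  # assumed output file

# At -framerate 30, a full day of one-per-minute JPGs (1440 frames)
# becomes roughly a 48-second clip.
# Note: "-pattern_type glob" isn't in every ffmpeg build (notably Windows);
# a numbered sequence like img%06d.jpg works everywhere.
subprocess.run(
    ["ffmpeg", "-y",
     "-framerate", "30",
     "-pattern_type", "glob", "-i", str(JPG_DIR / "*.jpg"),
     "-c:v", "libx264", "-pix_fmt", "yuv420p",
     str(OUTPUT)],
    check=True,
)
[/code]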
One example of a recent time-lapse export from BI, of a particularly interesting storm: https://youtu.be/-ErxKpeUp8I
You can see previous JPG-based exports if you browse that channel. If this new method works, I'll be upgrading my camera setup and adding at least one more view to better appreciate the storm motion from multiple directions.