1,495,379,997,000
I'd like to run ffmpeg -f x11grab for a specified amount of time, sending the output to a file. (This is on a Debian system, ffmpeg 7:4.0.2-1) I have already tried the -t switch, as below: ffmpeg -f x11grab -y -r 60 -video_size 1920x1080 -i :0.0 -t 10 -vf format=gray -pix_fmt yuv420p myfile but it won't stop after 10 seconds. Is there a way to do this? Thank you!
I would like to suggest the timeout command. I use it with ffmpeg to record a live HTTP stream. $ timeout --help Usage: timeout [OPTION] DURATION COMMAND [ARG]... Start COMMAND, and kill it if still running after DURATION. DURATION is a floating point number with an optional suffix: 's' for seconds (the default), 'm' for minutes, 'h' for hours or 'd' for days. It should come pre-installed on most Linux distributions. For 10 seconds, simply run: $ timeout 10s ffmpeg -f x11grab -y -r 60 -video_size 1920x1080 -i :0.0 -vf format=gray -pix_fmt yuv420p myfile Please note that there is also another timeout, which adds the ability to limit by memory and CPU frequency too. As for your problem with ffmpeg, this is from man ffmpeg: SYNOPSIS ffmpeg [global_options] {[input_file_options] -i input_url} ... {[output_file_options] output_url} ... You need to move the -t argument before -i.
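As a sanity check of how timeout behaves (assuming the GNU coreutils version): when it has to kill the command, it exits with status 124, so a wrapper script can tell a timed-out recording apart from a genuine ffmpeg failure.

```shell
# sleep would run for 5 s, but timeout kills it after 1 s and reports
# the kill through its own exit status (124 for GNU timeout).
timeout 1s sleep 5
echo "exit status: $?"   # exit status: 124
```

Any other exit status means the wrapped command finished (or failed) on its own before the deadline.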
Can I run ffmpeg -f x11grab for a specified amount of time?
1,495,379,997,000
I installed a package (I don't remember which one, but that shouldn't be too important) and needed "the real" ffmpeg, so I purged the Debian package for ffmpeg and installed the other version manually. The problem is that I now can't install or update any packages without aptitude telling me that package xyz requires ffmpeg, which is not being installed. However, since the package is installed, all the programs that depend on it work just fine. I don't want to overwrite the version I have with the repository version; I just want to tell aptitude that ffmpeg really is installed. I've tried apt-mark, but because aptitude doesn't think it's installed it won't mark it manual. As requested, apt-cache policy ffmpeg shows: ffmpeg: Installed: (none) Candidate: 7:3.2.10-1~deb9u1~bpo8+1 Version table: 10:2.6.9-dmo1 0 100 /var/lib/dpkg/status 7:3.2.10-1~deb9u1~bpo8+1 0 100 http://http.debian.net/debian/ jessie-backports/main amd64 Packages I'm still very much interested in a solution, but I decided that I would just install the repository version and then reinstall the version I want. Aptitude did "install" ffmpeg but it didn't overwrite my current version (maybe it put it in a different spot, or maybe it saw a "newer" version).
It is not recommended to install random packages with dpkg. Read the whole answer before taking any action. Apt is having an issue because it is not aware of your install of ffmpeg. As far as it is concerned you do not have ffmpeg installed, and it is giving you the above error because of this. Your output of apt-cache policy ffmpeg proves this. I am not aware of methods to associate a random .deb file with apt unless your package sets up apt during its install process. If someone can correct me on this, I would greatly appreciate it. If you need a package that is not available in your current repositories, you can, at your own discretion, add a testing/unstable/backports/whatever repository that has the package you are looking for to your sources.list and use the package manager. I do not recommend you use an unofficial Debian repo. Look up the available versions of ffmpeg here. If the version you want is in a repo other than stable, you should then add it to your sources.list. If you set up "apt pinning" you can manage multiple releases of Debian with little overhead. To fix your installation of ffmpeg, start by removing your current instance so we can start from scratch: apt-get purge ffmpeg dpkg --purge ffmpeg dpkg -l | grep ffmpeg This purges ffmpeg and then checks whether it is still installed. You should be looking for an rc status as per this post. You can then follow the guide in the wiki about apt pinning to add testing and unstable repos. Do not forget to create the files in /etc/apt/sources.list.d/ for each repo you plan on using, and a corresponding /etc/apt/preferences.d/ file. Once you have created the proper files to get set up with apt pinning you can then simply run: apt-get install ffmpeg/unstable to install ffmpeg from the unstable repo. You can substitute unstable with whatever repo your desired package is from if you set up apt pinning.
However, if you absolutely must use an unofficial Debian repo, here is a guide to add a third-party repo that contains the ffmpeg package you are probably looking for. If you decide to go with apt pinning, you will need to create the repo (/etc/apt/sources.list.d/deb-multimedia.list) and preferences (/etc/apt/preferences.d/deb-multimedia.preferences) files instead. Complete the following steps after purging ffmpeg in the manner described previously. nano /etc/apt/sources.list Add this line: deb http://www.deb-multimedia.org stretch main non-free Save and exit. Now follow these steps: apt-get update The "-oAcquire::AllowInsecureRepositories=true" option is mandatory since Buster and thus also needed for unstable. apt-get update -oAcquire::AllowInsecureRepositories=true apt-get install deb-multimedia-keyring -oAcquire::AllowInsecureRepositories=true Since Squeeze you can install this package with apt-get. Simply press y when the package asks what to do (don't press return). If apt-get cannot find the new key, do this: wget http://www.deb-multimedia.org/pool/main/d/deb-multimedia-keyring/deb-multimedia-keyring_2016.8.1_all.deb sudo dpkg -i deb-multimedia-keyring_2016.8.1_all.deb You can verify the package integrity with: sha256sum deb-multimedia-keyring_2016.8.1_all.deb 9faa6f6cba80aeb69c9bac139b74a3d61596d4486e2458c2c65efe9e21ff3c7d deb-multimedia-keyring_2016.8.1_all.deb Final and mandatory step: an apt-get update and apt-get dist-upgrade to install all packages related to ffmpeg. After that you can run apt-get install ffmpeg and you should have the correct package. Conclusion: I understand this covers a lot, but I wanted to be comprehensive. My personal philosophy when administrating Debian is to not stray from stable if possible, to not stray from official repos without properly set up apt pinning according to the Debian wiki, and to carefully limit and manage anything from a .deb or installed from source.
If you help prevent version conflicts between packages/libraries by having everything you possibly can managed by apt, you should be safe. Sorry if this is too long-winded or not what you needed; I just wanted to make sure I covered all the bases. Best of luck!
How do I tell aptitude that a package was installed manually?
1,495,379,997,000
I have a couple of radio plays of various series. Some are already single-track, some are multi-track. I want all of them to be single-track. I don't mind re-encoding; in fact I want to transfer them to my mobile device and prefer opus output. From inside the folder of a single audiobook, this seems to do the trick, converting mp3 to opus: ffmpeg -i "concat:$(ls *.mp3 | tr '\n' '|')" -acodec opus test.opus Now, I really have a lot of multi-track radio plays that I want to convert. I'd like to define a function that I can use with find, or pipe the results from ls into. I fumbled about with variants of this: function audioconcat { folder=$1; iformat=$2; oformat=$3; echo $folder; echo $iformat; echo $oformat; ffmpeg -i \'concat:$(find "$folder" -name *.$iformat | tr '\n' '|' | tr ' ' '\ ' | head -c -1)\' -acodec $oformat \'$folder.$oformat\'; } So the idea is to look for files of the given input format inside the folder, feed them to ffmpeg's concat, encode the stream to the given output format, and save it as a single file using the folder name. However, I always seem to have problems with whitespace and/or my nested function calls. What can I do to fix my function? Alternatively, what better method is there to do the conversion as described above?
You have multiple anti-patterns in your code that could be improved. See Why you shouldn't parse the output of ls(1). You don't need to parse the output of the ls command, and you can avoid the multiple shell pipelines with tr and find. It is better to use the globbing options provided by the native shell, which in your case should be bash. The piece of code $(ls *.mp3 | tr '\n' '|') could very well be written in bash, with the file-globbing options it provides, as shopt -s nullglob mp3FileList=(*.mp3) This extended shell option is enabled to ensure an empty glob is skipped instead of being stored into the array. You should cd into the folder and do the below. Note the final | after the array, since you originally had it in the list too; remove it if it's not needed. fileString=$( IFS='|'; echo "${mp3FileList[*]}|" ) The above variable now contains the list of files in |-separated format, with a | at the end, which can then be passed to your ffmpeg command as ffmpeg -i "concat:${fileString}" -acodec opus test.opus Regarding your second requirement, passing multiple options to the script, you could extend this to audioConcat() { (( "$#" < 3 )) && { printf 'insufficient arguments supplied' >&2; exit 1 ; } cd "$1" || { printf 'unable to navigate to target\n' >&2; exit 2 ; } shopt -q nullglob; nullglob_set=$? ((nullglob_set)) && shopt -s nullglob local fileList local fileString fileList=(*."${2}") if (( ${#fileList[@]} )); then fileString=$( IFS='|'; echo "${fileList[*]}" ) ffmpeg -i "concat:${fileString}" -acodec "$3" "$1.$3" else printf 'unable to find files of extension %s\n' "$2" >&2 exit 3 fi ((nullglob_set)) && shopt -u nullglob } Remember, when calling the function, to pass your arguments as audioConcat '/path/to/mp3files/' 'mp3' 'opus' I would highly recommend commenting out the ffmpeg line in the above function and checking that the variables are created as needed before calling the actual command.
Also confirm whether you need the trailing | in your file list. Quick summary of the constructs used in the function: The nullglob option set during pathname expansion avoids expanding an empty glob, i.e. when no .mp3 files are found the array ends up empty instead of containing the unexpanded glob. $(IFS='|'; echo "${mp3FileList[*]}") is a neat trick to print the output in |-separated format. We modify IFS (the Internal Field Separator) in a sub-shell so it is not modified globally. The array expansion with [*] concatenates the elements with the IFS value set. Some misc. notes to consider: Using exit from within the function would actually exit the current shell you are running the function from. It is not recommended when using the function from the command line, and is more suitable when run from a script with a proper interpreter shebang set, in which case it exits the sub-shell launched to run the script. If you plan to use it from the command line more often, replace the exit calls with return.
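The IFS join can be tried in isolation with a throwaway array (bash assumed, as in the answer):

```shell
#!/usr/bin/env bash
# IFS is changed only inside the command-substitution subshell, so the
# joined string uses '|' while the parent shell's IFS is untouched.
mp3FileList=(a.mp3 b.mp3 c.mp3)
fileString=$( IFS='|'; echo "${mp3FileList[*]}" )
echo "$fileString"   # a.mp3|b.mp3|c.mp3
```

Running echo "$IFS" | od -c afterwards confirms the parent shell still has the default space/tab/newline value.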
How can I convert a folder of audio files into a single file (iterate over many folders)?
1,495,379,997,000
I'm trying to convert an AAC file into WAV in order to pipe the output into LAME. I'm looking to do this, specifically: find . -maxdepth 1 -type f -iname "*.m4a" | sort | while read file; do ffmpeg -i "$file" -acodec pcm_s16le -ac 2 - | lame -b 256 -m s -q 0 - output.mp3 done I get the following error: Unable to find a suitable output format for 'pipe:' Warning: unsupported audio format Is there a way to specify the -acodec for the output? Reading the manpage now. I know that I can convert to MP3 within FFMPEG, but that's not what I want to do ;)
Thanks to the prompt from Miati's answer, I figured it out: ffmpeg -i file.m4a -f wav -acodec pcm_s16le -ac 2 - | \ lame -m s -b 320 -q 0 --replaygain-accurate - file.mp3 The format needs to be set when outputting to stdout.
Convert to WAV using FFMPEG for pipe into LAME?
1,495,379,997,000
I have several video files in a directory and I want to convert all of them into other video formats. Is there any way that I can convert all of them in just one go using FFMPEG. I mean without having to make a shell script for doing so.
The easiest way would be to use a for loop of your shell of choice. This task is so simple, you can just use the prompt, there's no need to create a shell script. Here is the one-liner as an example for the widely-used bash (and compatible): for i in *.mkv; do ffmpeg -i "$i" … ;done
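If the output name should be derived from the input, shell parameter expansion does it; a tiny sketch with a hypothetical file name (the ffmpeg options are elided):

```shell
# ${i%.mkv} strips the trailing ".mkv", so the converted file gets a
# fresh extension instead of ending up as "movie.mkv.mp4":
i="movie.mkv"
echo "${i%.mkv}.mp4"   # movie.mp4
```

The loop then becomes for i in *.mkv; do ffmpeg -i "$i" … "${i%.mkv}.mp4"; done.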
How to convert a group of video files using FFMPEG?
1,495,379,997,000
I'm using a program which continuously writes MPEG-TS video data to a file while it's running. I'm expecting it to run continuously for many days. I want to use ffmpeg to transcode this video data live. So that the .mts file doesn't grow continuously until I run out of hard drive space, I'm trying to get the first program to write to a named pipe and for ffmpeg to read from that pipe. I tried doing ffmpeg -i /tmp/test.mts -c:v libx264 test.mp4 but it seems that ffmpeg quits once it reaches the end of the pipe, instead of waiting for new data. For example if I start the program, wait 30 seconds and then run ffmpeg, I'll only get ~50 seconds of video out. (30 seconds + the time it takes ffmpeg to catch up) I have managed to get it working by doing ffmpeg -i pipe:0 -c:v libx264 test.mp4 < /tmp/test.mts but this feels kind of hacky to me, using stdin to do this. Is there a way I can directly provide the named pipe as an input to ffmpeg and have it wait for new data once it reaches the end of the current data? Thanks!
Simply open that fifo for writing (and keep it open) from another place, too. Example: In a window: mkfifo /tmp/test.mts exec 7<>/tmp/test.mts ffmpeg -i /tmp/test.mts out.mp4 In another window: cat ... >/tmp/test.mts cat ... >/tmp/test.mts The idea is that a reader won't receive an EOF from a pipe until all processes which had it open for writing have closed it: $ mkfifo /tmp/fifo $ cat /tmp/fifo & [1] 26437 $ exec 7>/tmp/fifo $ echo yes >/tmp/fifo yes $ echo yes >/tmp/fifo yes $ echo yes >/tmp/fifo yes $ exec 7>&- $ [1]+ Done cat /tmp/fifo Without the exec 7>/tmp/fifo which keeps an open handle to the writing end of /tmp/fifo, the cat would've terminated after the first echo.
How can I stop ffmpeg from quitting when it reaches the end of a named pipe?
1,495,379,997,000
I am trying to download the ffmpeg package and all of its dependencies into a directory on my computer. I use this code to do it sudo apt-get download $(apt-rdepends ffmpeg|grep -v "^ ") It works for the most part until it runs into this: W: Can't drop privileges for downloading as file '/home/daslab/compression/downloaded/ffmpeg2/ffmpeg2theora_0.30-1_amd64.deb' couldn't be accessed by user '_apt'. - pkgAcquire::Run (13: Permission denied) What permission am I missing and how do I get it?
You’re not missing a permission, you’re giving apt-get too much privilege; drop the sudo: apt-get download $(apt-rdepends ffmpeg|grep -v "^ ") apt-get download runs fine as a normal user. (Technically, you could give the _apt user access to the target directory, but it’s simpler and better to drop sudo.)
What permission am I missing here?
1,495,379,997,000
I'm running: ffmpeg -i rtmp://localhost/test -crf 20 -t 00:10:00 ./video/hq/1503411993750.mp4 >> out.log 2>>error.log And expecting >> out.log 2>>error.log to send stdout to out.log and stderr to error.log. When I tail both of these files during the process I get unexpected results. The contents of error.log seem to contain stdout. I get a constant stream of: frame=191 fps=190 q=26.0 size=229kB time=00:00:04.63 bitrate=404.1 frame=227 fps=149 q=26.0 size=273kB time=00:00:05.83 bitrate=382.8 frame=242 fps=120 q=26.0 size=288kB time=00:00:06.33 bitrate= 372.6 frame=258 fps=101 q=26.0 size=306kB time=00:00:06.86 bitrate= 365.2 frame=273 fps=89 q=26.0 size=324kB time=00:00:07.36 bitrate= 360.1 # ... continued Shouldn't the above be in out.log? Is the issue with how I've written the command or something unusual with ffmpeg?
Apparently all diagnostic messages in ffmpeg are sent to stderr, so the problem isn't your syntax. A normally running ffmpeg task sends all of its console output to stderr even when there are no errors. This depends on what you mean by "output": ffmpeg sends all diagnostic messages (the "console output") to stderr because its actual output (the media stream) can go to stdout, and mixing the diagnostic messages with the media stream would break the output. From: https://lists.ffmpeg.org/pipermail/ffmpeg-user/2014-March/020605.html Another thing I'm unsure of: if the above is true, shouldn't out.log contain video data? Instead it's always empty. (I guess this is because I've specified the output to be written to ./video/hq/1503411993750.mp4 instead.)
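The redirection syntax in the question is fine; a minimal demo with a command that writes to both streams shows each message landing in the expected file (plain > is used here for a clean run; >> would append, as in the question):

```shell
# stdout goes to out.log and stderr to error.log, the same shape as the
# ffmpeg invocation in the question.
{ echo "normal output"; echo "diagnostic output" >&2; } >out.log 2>error.log
cat out.log     # normal output
cat error.log   # diagnostic output
```

Since ffmpeg writes its progress lines to stderr, seeing them in error.log is exactly this behaviour, not a redirection mistake.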
ffmpeg command >> out.log 2>>error.log
1,495,379,997,000
I want to convert DAV format video clips from CCTV recorders to AVI using the command below (converting test - 1.dav to test - 1.avi): ffmpeg -y -i test\ -\ 1.dav -vcodec copy -movflags +faststart test\ -\ 1.avi It works properly on x86_64 Linux with ffmpeg ver 2.4.13 and Synology with ffmpeg ver 2.0.2. The file after converting is fully playable on Windows, Android, iPad and Linux. But when I try to convert it on a Synology equipped with the newest version of the software, it fails (ffmpeg ver 2.7.1). Below is the output from the conversion process: ffmpeg version 2.7.1 Copyright (c) 2000-2015 the FFmpeg developers built with gcc 4.9.3 (crosstool-NG 1.20.0) 20150311 (prerelease) configuration: --prefix=/usr --incdir='${prefix}/include/ffmpeg' --arch=arm --target-os=linux --cross-prefix=/usr/local/arm-unknown-linux-gnueabi/bin/arm-unknown-linux-gnueabi- --enable-cross-compile --enable-optimizations --enable-pic --enable-gpl --enable-shared --disable-static --enable-version3 --enable-nonfree --enable-libfaac --enable-encoders --enable-pthreads --disable-bzlib --disable-protocol=rtp --disable-muxer=image2 --disable-muxer=image2pipe --disable-swscale-alpha --disable-ffserver --disable-ffplay --disable-devices --disable-bzlib --disable-altivec --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libmp3lame --disable-vaapi --disable-decoder=amrnb --disable-encoder=zmbv --disable-encoder=dca --disable-encoder=ac3 --disable-encoder=ac3_fixed --disable-encoder=eac3 --disable-decoder=dca --disable-decoder=eac3 --disable-decoder=truehd --cc=/usr/local/arm-unknown-linux-gnueabi/bin/arm-unknown-linux-gnueabi-ccache-gcc libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 41.100 / 56. 41.100 libavformat 56. 36.100 / 56. 36.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 16.101 / 5. 16.101 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53.
3.100 Input #0, h264, from 'test - 1.dav': Duration: N/A, bitrate: N/A Stream #0:0: Video: h264 (Main), yuv420p, 704x576, 25 fps, 25 tbr, 1200k tbn, 50 tbc Output #0, avi, to 'test - 1.avi': Metadata: ISFT : Lavf56.36.100 Stream #0:0: Video: h264 (H264 / 0x34363248), yuv420p, 704x576, q=2-31, 25 fps, 25 tbr, 50 tbn, 50 tbc Stream mapping: Stream #0:0 -> #0:0 (copy) Press [q] to stop, [?] for help [avi @ 0xb0ff0] H.264 bitstream malformed, no startcode found, use the video bitstream filter 'h264_mp4toannexb' to fix it ('-bsf:v h264_mp4toannexb' option with ffmpeg) av_interleaved_write_frame(): Invalid data found when processing input frame= 1 fps=0.0 q=-1.0 Lsize= 6kB time=00:00:00.02 bitrate=2300.0kbits/s video:40kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown Conversion failed! I have to mention that in each case I use the same input file and the same ffmpeg syntax for conversion. The only differences are the versions of ffmpeg. What is most strange, it is the newest version that fails. What mistake am I making? Any ideas?
Newer ffmpeg versions refuse to mux H.264 encoded video without startcodes into AVI. You should use -bsf:v h264_mp4toannexb, as indicated in the ffmpeg output, e.g.: ffmpeg -y -i test\ -\ 1.dav -vcodec copy -bsf:v h264_mp4toannexb -movflags +faststart test\ -\ 1.avi This will not re-encode your video.
DAV to AVI conversion failed
1,495,379,997,000
When trying to convert either an mp3 or flac file to ogg, the output ogg file is actually a flac file with a large file size. For instance: running for file in *.mp3; do ffmpeg -i "${file}" "${file/%mp3/ogg}"; done and then checking the file with mediainfo output.ogg gives: General Complete name : 06 - Magma.ogg Format : Ogg Format/Info : Free Lossless Audio Codec File size : 47.0 MiB Duration : 6mn 42s Overall bit rate mode : Variable Overall bit rate : 980 Kbps Audio ID : 1238237382 (0x49CE00C6) Format : FLAC Format/Info : Free Lossless Audio Codec Duration : 6mn 42s Bit rate mode : Variable Channel(s) : 2 channels Channel positions : Front: L R Sampling rate : 44.1 KHz Bit depth : 16 bits Writing library : Lavf57.37.100 The input mp3 in my case was 6 megabytes, but the ogg file is for some reason 47 megabytes. Why is this happening, and is there a way to convert files to ogg without this happening?
As you can see in the output, you encoded your audio into Format : FLAC. This is a format with lossless compression. ogg is just a container, and can hold different formats. To keep a size and quality similar to your mp3, you can choose the more usual Vorbis format explicitly: ffmpeg -i in.mp3 -c:a libvorbis out.ogg The fact that it did not do this by default may mean you need to install a libvorbis package, depending on your system.
Converting files to OGG with FFMPEG produces an extremely large file
1,495,379,997,000
I am working on a project in which the user can upload videos. Is there any way with FFmpeg I can take some images from the video and create a GIF out of it? As the project is in Java, I have a way to get an image from a video, but to create a GIF requires multiple images, and it's proving costly. The server is running a Debian X64 system, so if FFMpeg is not suitable, I am open to other tools on Linux which can do this efficiently.
I do scene extraction from videos using VLC for Linux. If you don't have it, use apt-get install vlc to install it. Once installed, you can use a variant of the following command line to extract frame(s) out of your video. The default image format is png and it is good for my purpose. If you insist on gif images, I suggest installing imagemagick for image format conversions. Here is the command that extracts the frames: cvlc ${videofile} \ --video-filter=scene \ --vout=dummy \ --start-time=${start-sec} --stop-time=${end-sec} \ --scene-ratio=1 --scene-prefix=${prefix} \ --scene-path=${MyStorePath} \ vlc://quit where videofile is an mp4 format video (other formats might be possible, but I didn't test them), start-sec is where you want your frame grab to start, in seconds from the beginning, end-sec is where you want your frame grab to end, in seconds from the beginning (must be greater than start-sec), prefix is the prefix of the file names for captured images, and MyStorePath is the path where you want to store captured images. And this command helps you figure out the video length: ffmpeg -i ${vidfile} 2>&1 | grep Duration | cut -d ' ' -f 4 | sed s/,// The output is in HH:MM:SS.xx format. To convert this into video length in seconds, I use l=$(ffmpeg -i ${vidfile} 2>&1 | grep Duration | cut -d ' ' -f 4 | sed s/,//) h=$(echo $l|cut -d: -f1) m=$(echo $l|cut -d: -f2) s=$(echo $l|cut -d: -f3|cut -d"." -f1) (( hh=10#$h*3600 )) (( mm=10#$m*60 )) (( LengthSeconds=hh+mm+10#$s )) (the 10# prefix stops zero-padded values such as 08 from being parsed as octal). At this point you can manipulate the LengthSeconds variable to automatically determine start and end times. Unfortunately, for my vlc command to work, you have to specify a time slice to extract frames from.
FFMpeg : Converting a video file to a gif with multiple images from video
1,495,379,997,000
I have various video files as MKV, including some high-def (1080p) with FLAC audio. These can be played fine on a several-years-old PC with a mid-range graphics card (using mpv/ffmpeg), but when I tried to play them on a Kindle Fire HD8 (using VLC for Android) it caused it to choke. How can I reencode them such that the lower-powered machine can play them? I assume this will lead to a decline in quality, but I doubt I would notice on the smaller screen anyway. I already have ffmpeg installed and hope to just use that, but if it's easier I can install some other tool. Presumably there are multiple tradeoffs to be made here; it would be nice to have some idea of the options. (I can also post more details about the precise encoding of the files in question, if that would be useful; at the moment I don't know what would be relevant.)
I ended up doing something like: ffmpeg -i input-file.mkv -vcodec h264 -s:v 1280x800 -acodec copy output-file.mkv Of note: using -vcodec copy doesn't work, since that bypasses the decode/encode altogether and thus doesn't allow applying filters. Downscaling the video to this degree ended up shrinking the files dramatically and solving the problems with performance. I also tried further restricting the bitrate with -b, but that caused a noticeable quality hit while not making any difference to playback performance.
How best to reencode video for lower-performance playback?
1,495,379,997,000
I wish I could find ffmpeg help here at Stack Exchange. I have been re-encoding old videos to libx264 to save some storage space, but what I thought should work based on the documentation seems to be failing for me. I have been using the snippet below to re-encode all files: ffmpeg -i "$file" -y -acodec copy -vcodec libx264 -scodec copy -threads 12 -x264-params keyint=240:min-keyint=20 -profile:v baseline -level 3.0 "$output" I believed the -acodec copy should map all audio streams and just copy them all to the new container, right? But I can't understand why all those dual-audio videos are being re-encoded to single-audio videos; the non-default audio stream is omitted/removed. Any idea how I can copy all audio and subtitle streams, and re-encode only the video to libx264? Thanks!
https://trac.ffmpeg.org/wiki/How%20to%20use%20-map%20option#Example4 -map 0 maps all the streams and keeps them as they are; then just state which codec to re-encode. I should read more.
ffmpeg -acodec copy does not copy all audio stream to the new container
1,422,617,062,000
I want to generate a video with the exact same settings as an existing video. I can run: ffmpeg -i original.mp4 To get human-readable info about the contents: Duration: 00:05:32.32, start: 0.000000, bitrate: 474 kb/s Stream #0.0(und): Video: h264, yuv420p, 480x360, 25 tbr, 25 tbn, 50 tbc Stream #0.1(und): Audio: aac, 44100 Hz, stereo, s16 But what I'd like is to get the command-line I'd need to convert another video into the exact same format, so something like: ffmpeg -i source.mp4 -vcodec h264 - .... Any ideas if such a thing is possible?
There is no built-in option to do this, so you would need to write a program to do it. You need to parse the output of ffmpeg -i. Then you need to build a string containing all of the relevant information, formatted as a command line. It would need to know how to handle any properties that concern you. As Graeme noted, some format options are reported differently from the command you need to produce them (and these often depend on the specific components that you have installed). Occasionally the completed script would need to be updated because of changes in ffmpeg. It's definitely possible, but it would be difficult and time-consuming. It might not be worth doing. If you still want to try it, you'll probably need experience in these areas: regular expressions; media encoding in general; ffmpeg options and components (I recommend compiling it yourself if you haven't already); and (probably) a good dynamic programming language (Perl seems like a logical candidate, if you can understand it). By the time you finish, you might find that you prefer to type in your own command lines anyway.
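As a toy illustration of the parsing step, here is one way to pull the video codec out of a Stream line like the one shown in the question (a sketch of one property only, not the full program):

```shell
# A Stream line as printed by `ffmpeg -i` (copied from the question):
line='Stream #0.0(und): Video: h264, yuv420p, 480x360, 25 tbr, 25 tbn, 50 tbc'
# sed captures everything between "Video: " and the next comma.
vcodec=$(printf '%s\n' "$line" | sed -n 's/.*Video: \([^,]*\),.*/\1/p')
echo "-vcodec $vcodec"   # -vcodec h264
```

A real version would extract each property this way (pixel format, size, audio codec, sample rate, …) and concatenate them into the final command line, which is exactly where the maintenance burden described above comes from.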
Can I get the ffmpeg command-line from an existing video?
1,422,617,062,000
I'm struggling with ffmpeg. My webcam can do 720p at 30fps, but only when using the MJPEG codec: ~> v4l2-ctl --list-formats-ext ioctl: VIDIOC_ENUM_FMT Index : 0 Type : Video Capture Pixel Format: 'YUYV' Name : YUV 4:2:2 (YUYV) -- cut -- Size: Discrete 1280x720 Interval: Discrete 0.133 s (7.500 fps) Interval: Discrete 0.200 s (5.000 fps) -- cut -- Index : 1 Type : Video Capture Pixel Format: 'MJPG' (compressed) Name : MJPEG -- cut -- Size: Discrete 1280x720 Interval: Discrete 0.033 s (30.000 fps) Interval: Discrete 0.040 s (25.000 fps) Interval: Discrete 0.050 s (20.000 fps) Interval: Discrete 0.067 s (15.000 fps) Interval: Discrete 0.100 s (10.000 fps) Interval: Discrete 0.200 s (5.000 fps) -- cut -- I can't figure out how to tell ffmpeg to read the MJPEG compressed format. It is trying to read the raw variant, which ends up with horrible quality.
You don't say what options you're using, but I did find these two examples. Do these work for you? ffmpeg -i <input_file> -vcodec mjpeg -qmin 1 -qmax 1 <output_file.avi> ffmpeg -i <input_file> -vcodec mjpeg -qscale 1 <output_file.avi> For the second example, I found a note mentioning that the -qscale switch made a noticeable difference. The lower its value, the better the quality (range 2 to 32, fractions allowed).
Recording a webcam using ffmpeg
1,422,617,062,000
I have a mix of MP4 files some of them only have audio data while the others have video and audio data. I want to find a way to convert the ones that don't have any video data to MP3 without checking each file one by one.
Run this on an input to check whether it has video: ffmpeg -i INPUT -map v -vframes 1 -c copy -f null - The exit code will be 0 if it has video.
How to tell if a MP4 has video data and then convert it to an MP3 if there is only audio data in the file
1,422,617,062,000
I need to split wav files into multiple 10-second-long wav files, but each resulting wav file must be exactly 10 seconds in length, adding silence if needed – so if a wav file's duration in seconds isn't a multiple of 10, the last wav file should be padded with silence. I've seen some answers (1, 2, 3) which show how to use sox and ffmpeg to split a file into chunks of equal length: $ ffmpeg -i file.wav -f segment -segment_time 10 -c copy out%03d.wav $ sox file.wav output-.wav trim 0 10 : newfile : restart but the last file produced by these commands is usually less than 10 seconds long. Is there a way to split a wav file, padding the last file if needed, in the same command?
Split the files, inspect the resulting files (for i in *wav), if (length < 10 seconds), pad them. To get the wave file length: sox --info -D file.wav To pad the wave file: https://superuser.com/questions/579008/add-1-second-of-silence-to-audio-through-ffmpeg Maybe do some calculations :-)
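For the calculation step, the arithmetic can be sketched like this (pad_needed is a hypothetical helper; the clip length would come from sox --info -D, and sox's pad 0 N effect appends N seconds of silence):

```shell
# Seconds of silence needed to round a clip length up to exactly 10 s.
# awk handles fractional lengths, since its % operator works on floats.
pad_needed() {
  awk -v len="$1" 'BEGIN {
    r = len % 10
    if (r == 0) print 0
    else printf "%.2f\n", 10 - r
  }'
}
pad_needed 7.38   # 2.62
pad_needed 20     # 0
# e.g.:  sox out003.wav padded003.wav pad 0 "$(pad_needed "$len")"
```

Only the last chunk produced by the split ever needs this, so the loop can skip any file already reporting a length of 10.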
split wav file into parts of equal duration, padding with silence if needed
1,422,617,062,000
I have been writing a script for a long time now, but lately this problem has been driving me crazy. I have tried everything but couldn't solve it. find . -iname "*.mp4" -type f -exec ffmpeg -i "{}" -c:a "$ACODEC" -c:v "$VCODEC" -vf \ "subtitles={}.$SUBEXTE:'force_style=fontsize=$FSIZE,fontname=$FNAME'" \ "{}.$EXTE" -hide_banner \; The idea is that whenever find finds an mp4 file, it runs this FFmpeg command. But when I want to try something more complex, like command substitution for the last variable {}.$EXTE, I have to use this syntax: find . -iname "*.mp4" -type f -exec sh -c 'ffmpeg -i "$1" -c:a "$2" -c:v "$3"\ -vf "subtitles=$1.$4:'force_style=fontsize=$5,fontname=$6'" \ "$1.$7" -hide_banner' -- "{}" "$ACODEC" "$VCODEC" "$SUBEXTE" "$FSIZE" "$FNAME" "$EXTE" \; As you can see, with this syntax, when I use sh -c, I need to add extra single quotes ' '. But this breaks the FFmpeg command. I have tried almost every possible variation, adding and removing single and double quotes, and even tried backslashes. Please help me! Some of the errors I got: Unable to open ./sub.srt [AVFilterGraph @ 0x7fffdb7d35e0] Error initializing filter 'subtitles' with args './sub.srt:force_style=fontsize=' Error reinitializing filters! Failed to inject frame into filter network: No such file or directory Error while processing the decoded data for stream #0:0 Conversion failed! And sometimes I get an error like "not suitable format for fontname".
A single quoted string can never contain a single quote. A solution to this conundrum would in your case be to replace each internal ' in your in-line sh -c script with '\'' (or '"'"'). What this does is to temporarily break out of the single quoted string (the first ' in '\'' ends the single quoted string), insert a literal ' (the \' in '\'', which you could also write as "'", but that is far too many quotes and makes it unreadable), and then concatenate that with the remaining original single quoted string (the final ' in '\'' starts a new single quoted string). The sh -c command would then look like sh -c ' ffmpeg -i "$1" -c:a "$2" -c:v "$3" \ -vf "subtitles=$1.$4:'\''force_style=fontsize=$5,fontname=$6'\''" \ "$1.$7" -hide_banner' ...arguments as before ...
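A toy example of the mechanism, stripped of the ffmpeg specifics (the force_style string is just sample text):

```shell
# The whole -c script is single quoted, yet thanks to the '\'' trick the
# inner echo still receives one single-quoted argument.
sh -c 'echo '\''force_style=fontsize=20'\'''
# prints: force_style=fontsize=20
```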
Single quote problem in "sh -c" script launched from "find"
1,422,617,062,000
I have loads of videos named different things and with different extensions (some mp4, some wmv, etc.). I would like to run the command below on every video in a specific directory and save the result in another directory, with no overwriting of the original files. Some videos may be duplicates; if that happens it should pause the script. For example, there may be both video1.mp4 & video1.avi; both can't become video1.mp4 in /home/videos/processed, so the script should pause. ffmpeg -i '/home/videos/unprocessed/vidabc.mp4' -f mp4 -vcodec libx264 -preset fast -profile:v main -acodec aac -movflags +faststart '/home/videos/processed/vidabc.mp4' ffmpeg -i '/home/videos/unprocessed/vidxyz.mp4' -f mp4 -vcodec libx264 -preset fast -profile:v main -acodec aac -movflags +faststart '/home/videos/processed/vidxyz.mp4'
You can process every file in a directory using the find command with the -execdir flag. Example: find /home/videos/unprocessed -type f \ -execdir ffmpeg -i '{}' -f mp4 -vcodec libx264 -preset fast \ -profile:v main -acodec aac -movflags +faststart '/home/videos/processed/{}.mp4' \; -type f indicates that you want to find only files, not directories. After the -execdir flag, '{}' is replaced with the path of the file, e.g. ./vidabc.mp4. At the end of the -execdir command, include a \; to terminate the command. Note that in this example, your resulting filenames might look like: /home/videos/processed/vidabc.mp4.mp4 /home/videos/processed/vidxyz.avi.mp4 /home/videos/processed/vidxyz.mp4.mp4 If that is a problem, there are ways to address that. See Command “find” -exec replacing string for a similar example.
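One gap in this approach is the question's pause-on-duplicate requirement: find -execdir offers no easy hook for it. A hypothetical helper (names invented here) can compute the target filename, so a wrapper loop can check for clashes before invoking ffmpeg:

```shell
# Map a source file to its output name, so duplicate basenames
# (video1.mp4 vs video1.avi) can be detected before transcoding.
dest_name() {
    # $1 = source path, $2 = output directory
    base=${1##*/}         # strip the directory part
    printf '%s/%s.mp4' "$2" "${base%.*}"
}

# Usage sketch: check for an existing output before running ffmpeg.
# out=$(dest_name "$f" /home/videos/processed)
# [ -e "$out" ] && read -r -p "Duplicate $out - press Enter to continue " _
```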
ffmpeg mass transcode videos in directory
1,422,617,062,000
I'm having trouble trying to build a static binary of ffmpeg - I've got almost the whole build working, with the exception of two libs - libvorbis and libmp3lame. These two libs are failing during ./configure, specifically on undefined functions from the math.h / libm: libvorbis: gcc -L/vol/build/lib -static -static-libstdc++ -static-libgcc -Wl,--as-needed -Wl,-z,noexecstack -I/vol/build/include -L/vol/build/lib -o /tmp/ffconf.UKKLGhCv/test /tmp/ffconf.UKKLGhCv/test.o -lvorbis -lm -logg -lstdc++ -lpthread -lexpat -ldl -lm --enable-libopencore-amrnb /vol/build/lib/libvorbis.a(envelope.o): In function `_ve_envelope_init': envelope.c:(.text+0x983): undefined reference to `_ZGVbN2v_sin' envelope.c:(.text+0x9a9): undefined reference to `_ZGVbN2v_sin' /vol/build/lib/libvorbis.a(lsp.o): In function `vorbis_lsp_to_curve': lsp.c:(.text+0x650): undefined reference to `_ZGVbN2v_cos' lsp.c:(.text+0x669): undefined reference to `_ZGVbN2v_cos' libmp3lame: gcc -L/vol/build/lib -static -static-libstdc++ -static-libgcc -Wl,--as-needed -Wl,-z,noexecstack -o /tmp/ffconf.dC4w1f5B/test /tmp/ffconf.dC4w1f5B/test.o -lmp3lame -lm -lstdc++ -lpthread -lexpat -ldl -lm --enable-libopencore-amrnb /vol/build/lib/libmp3lame.a(psymodel.o): In function `init_s3_values': psymodel.c:(.text+0x14d3): undefined reference to `_ZGVbN2v___exp_finite' psymodel.c:(.text+0x14fa): undefined reference to `_ZGVbN2v___exp_finite' /vol/build/lib/libmp3lame.a(psymodel.o): In function `psymodel_init': psymodel.c:(.text+0xb62d): undefined reference to `_ZGVbN4vv___powf_finite' psymodel.c:(.text+0xb677): undefined reference to `_ZGVbN4vv___powf_finite' psymodel.c:(.text+0xb6c4): undefined reference to `_ZGVbN4vv___powf_finite' psymodel.c:(.text+0xb711): undefined reference to `_ZGVbN4vv___powf_finite' psymodel.c:(.text+0xb75b): undefined reference to `_ZGVbN4vv___powf_finite' /vol/build/lib/libmp3lame.a(psymodel.o):psymodel.c:(.text+0xb7a2): more undefined references to `_ZGVbN4vv___powf_finite' follow 
/vol/build/lib/libmp3lame.a(util.o): In function `fill_buffer': util.c:(.text+0x28a6): undefined reference to `_ZGVbN2v_cos' util.c:(.text+0x28cc): undefined reference to `_ZGVbN2v_cos' util.c:(.text+0x28fb): undefined reference to `_ZGVbN2v_cos' util.c:(.text+0x2921): undefined reference to `_ZGVbN2v_cos' util.c:(.text+0x29cc): undefined reference to `_ZGVbN2v_sin' util.c:(.text+0x29e8): undefined reference to `_ZGVbN2v_sin' I can't figure out how to get these to build successfully. From what I understand, passing the -lm option should be enough, but apparently it isn't. I checked for the presence of libm.a, which is located at /usr/lib/x86_64-linux-gnu/libm.a; I also tried to pass this directory in the -L flags, but it made no difference. The libs build fine when removing the -static flag, but the resulting binary is (duh) linked against libm.so. Just in case, these are the flags I'm using to build the two libraries: libvorbis: ./configure --prefix=${CMAKE_BINARY_DIR} --disable-shared --disable-oggtest libmp3lame: ./configure --prefix=${CMAKE_BINARY_DIR} --disable-shared I'd appreciate any pointers on how to fix or debug this any further. Edit: after playing around with it some more, it seems like libm is getting linked in - when I remove the -lm flag, I get a ton more undefined references - sin, cos, __pow_finite, etc. When I put it back in, most of these go away and only the mangled symbols, such as _ZGVbN4vv___powf_finite and _ZGVbN2v_cos, remain.
Well, I managed to solve it - googling the mangled symbols such as _ZGVbN2v_cos led me to this patch mentioning vector math, and in combination with ldd's output during dynamic linking mentioning libmvec, I realized that I might have to link that in as well. For libmp3lame, it has to be linked in before libm: gcc -L/vol/build/lib -static -o /tmp/ffconf.dC4w1f5B/test /tmp/ffconf.dC4w1f5B/test.o -lmp3lame -lmvec -lm For libvorbis, the order of -lm and -lmvec doesn't matter, it builds either way.
Error during static build of libvorbis and libmp3lame
1,422,617,062,000
I've got an embedded device running without monitor, running Debian Jessie. Since I don't need a UI, I considered cleaning up the X11 packages. This gave a somewhat unexpected result: sudo -u nobody apt-get remove '^x11' -s This produces the following result: The following packages will be REMOVED: ffmpeg libavdevice57 libavfilter6 ... libx11-dev libxau-dev ... The ffmpeg that would be removed is this version from jessie-backports. That was not intended. The libx11-dev strictly speaking doesn't match '^x11' but I can explain that as an automatically installed package being auto-removed. But ffmpeg is manually installed (as confirmed by apt-mark showmanual). What is the link between packages named ^x11 and ffmpeg that causes this? I've also tried sudo -u nobody apt-get remove '^vnc' -s and sudo -u nobody apt-get autoremove Neither affects ffmpeg; it's not some orphan package that gets auto-removed regardless. It is specifically tied to X11.
I haven't traced the complete dependency tree, but the linked package has at least the following chain of dependencies: ffmpeg depends on libsdl2, which in turn depends on libxss1, which in turn depends on x11-common. Since x11-common matches ^x11, it is removed, breaking a dependency of ffmpeg. Thus, ffmpeg has to be removed. Assuming this is the only such chain, you should be able to keep ffmpeg by ensuring that x11-common is not removed.
Why is ffmpeg removed as part of x11?
1,422,617,062,000
I've been searching for hours now and trying various methods of installing FFMPEG on my CentOS server. This is what I have currently installed on my Ubuntu desktop: FFmpeg version 0.6.2-4:0.6.2-1ubuntu1.1, Copyright (c) 2000-2010 the Libav developers built on Sep 16 2011 17:00:39 with gcc 4.5.2 configuration: --extra-version=4:0.6.2-1ubuntu1.1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --enable-libvpx --disable-stripping --enable-runtime-cpudetect --enable-vaapi --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static WARNING: library configuration mismatch libavutil configuration: --extra-version=4:0.6.2-1ubuntu2+medibuntu1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libdirac --enable-libgsm --enable-libopenjpeg --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --enable-libvpx --disable-stripping --enable-runtime-cpudetect --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-version3 --enable-vaapi --enable-libopenjpeg --enable-libfaac --enable-nonfree --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libfaad --enable-libdirac --enable-libfaad --enable-libmp3lame --enable-librtmp --enable-libx264 --enable-libxvid --enable-libdc1394 --enable-shared --disable-static libavcodec configuration: --extra-version=4:0.6.2-1ubuntu2+medibuntu1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libdirac --enable-libgsm --enable-libopenjpeg --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --enable-libvpx --disable-stripping --enable-runtime-cpudetect --enable-libopencore-amrnb 
--enable-libopencore-amrwb --enable-version3 --enable-vaapi --enable-libopenjpeg --enable-libfaac --enable-nonfree --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libfaad --enable-libdirac --enable-libfaad --enable-libmp3lame --enable-librtmp --enable-libx264 --enable-libxvid --enable-libdc1394 --enable-shared --disable-static libswscale configuration: --extra-version=4:0.6.2-1ubuntu2+medibuntu1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libdirac --enable-libgsm --enable-libopenjpeg --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --enable-libvpx --disable-stripping --enable-runtime-cpudetect --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-version3 --enable-vaapi --enable-libopenjpeg --enable-libfaac --enable-nonfree --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libfaad --enable-libdirac --enable-libfaad --enable-libmp3lame --enable-librtmp --enable-libx264 --enable-libxvid --enable-libdc1394 --enable-shared --disable-static libpostproc configuration: --extra-version=4:0.6.2-1ubuntu2+medibuntu1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libdirac --enable-libgsm --enable-libopenjpeg --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --enable-libvpx --disable-stripping --enable-runtime-cpudetect --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-version3 --enable-vaapi --enable-libopenjpeg --enable-libfaac --enable-nonfree --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libfaad --enable-libdirac --enable-libfaad --enable-libmp3lame --enable-librtmp --enable-libx264 --enable-libxvid --enable-libdc1394 --enable-shared --disable-static libavutil 50.15. 1 / 50.15. 1 libavcodec 52.72. 2 / 52.72. 2 libavformat 52.64. 2 / 52.64. 
2 libavdevice 52. 2. 0 / 52. 2. 0 libavfilter 1.19. 0 / 1.19. 0 libswscale 0.11. 0 / 0.11. 0 libpostproc 51. 2. 0 / 51. 2. 0 Use -h to get full help or, even better, run 'man ffmpeg' Hyper fast Audio and Video encoder usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}... I'm particularly interested in --enable-librtmp, as I need to use FFMPEG to stream to an RTMP server. I've followed numerous tutorials (one in particular had me install a DAG repository which no longer exists x|) but am not really having much luck. How can I install it on my server with the functionality I need?
You can install it from source: svn checkout svn://svn.ffmpeg.org/ffmpeg/trunk ffmpeg cd ffmpeg ./configure make Then as root: make install Check the dependency list of ffmpeg before you carry out the above steps. Make sure you have all the necessary packages. Alternatively, the dependencies can also be installed from source. For example, to compile faac and faad, download and extract faac and faad from audiocoding.com. As above, run ./configure make su -c 'make install' Install LAME if you want mp3 support. Download codecs: git clone git://git.videolan.org/x264.git cd x264 ./configure make When configuring ffmpeg, instead of just typing "./configure", you can specify which modules you want to enable, as follows: ./configure --enable-gpl --enable-nonfree --enable-pthreads --enable-libx264 --enable-libfaac --enable-libfaad --enable-libmp3lame make Since you need RTMP streaming, also add --enable-librtmp here (it requires the librtmp development files, from the rtmpdump project, to be installed first). Then run: su -c 'make install'
Install FFMPEG on RHEL/CentOS
1,422,617,062,000
I have little background in programming and need to create a batch script to extract the audio from multiple video files. Execution is done through the context menu in Nautilus/Gnome Files, stored in Nautilus' scripts folder as a bash .sh. The following code works for 1 file, but when selecting multiple files it doesn't. Could someone please help me modify the code to make it work? #!/bin/bash FILENAME=$(echo $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS | sed -e 's/\r//g') FILENAME2=$(echo "$FILENAME" | cut -f 1 -d '.') ffmpeg -i "${FILENAME}" -vn -acodec pcm_s16le -ac 2 -ar 48000 "${FILENAME2}".wav # finished message box zenity --info --title "Processing completed" --text "${FILENAME2}.wav at 48kHz has been generated." --width=700
Use this script; I cannot test it with ffmpeg, but it should work. #!/bin/bash { readarray FILENAME <<< "$(echo -e "$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS" | sed -e 's/\r//g')" echo -e "Logs: $(date)\n" > ~/Desktop/data.txt for file in "${FILENAME[@]}"; do file=$(echo "$file" | tr -d $'\n') echo "Current file: $file" >> ~/Desktop/data.txt ffmpeg -i "$file" -vn -acodec pcm_s16le -ac 2 -ar 48000 "${file%.*}.wav" zenity --info --title "Processing completed" --text "${file%.*}.wav at 48kHz has been generated." --width=700 done } 2>~/Desktop/ffmpeg.logs The code above will print a message with zenity each time an mp4 is processed. But if you want to display the message only when all files have been processed, then you can use this script: #!/bin/bash { readarray FILENAME <<< "$(echo -e "$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS" | sed -e 's/\r//g')" echo -e "Logs: $(date)\n" > ~/Desktop/data.txt for file in "${FILENAME[@]}"; do file=$(echo "$file" | tr -d $'\n') echo "Current file: $file" >> ~/Desktop/data.txt ffmpeg -i "$file" -vn -acodec pcm_s16le -ac 2 -ar 48000 "${file%.*}.wav" done zenity --info --title "Processing completed" --text "$( printf "%s.wav\n" "${FILENAME[@]%.*}") at 48kHz has been generated." --width=700 } 2>~/Desktop/ffmpeg.logs I suggest you use this script.
It is able to detect which files failed and which were generated successfully: #!/bin/bash { readarray FILENAME <<< "$(echo -e "$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS" | sed -e 's/\r//g')" echo -e "Logs: $(date)\n" > ~/Desktop/data.txt okFiles=() errFiles=() for file in "${FILENAME[@]}"; do file=$(echo "$file" | tr -d $'\n') echo -e "\n===========================" >> ~/Desktop/data.txt echo "Current file: $file" >> ~/Desktop/data.txt ffmpeg -i "$file" -vn -acodec pcm_s16le -ac 2 -ar 48000 "${file%.*}.wav" && { okFiles+=("${file%.*}.wav") : } || { errFiles+=("${file%.*}.wav") } done if [[ ${#okFiles[@]} -gt 0 ]]; then zenity --info --title "Processing completed" --text "$(printf '%s\n' ${okFiles[@]})\n at 48kHz have/has been generated." --width=700 fi if [[ ${#errFiles[@]} -gt 0 ]]; then zenity --info --title "Error while processing some files" --text "Following files:\n$(printf "%s\n" "${errFiles[@]}")\ncould not be generated." --width=700 fi } 2>~/Desktop/ffmpeg.logs About: { code code } 2>~/Desktop/ffmpeg.logs I used that to be able to detect what fails while every file is being processed. For example, if ffmpeg throws an error on some file, you will be able to check the logs at ~/Desktop/ffmpeg.logs Btw, if you want every processed file to be located in a specific path and not where you call the script, you can do something like this (before readarray): { cd ~/Audios/path/to/dir #the path you want can be placed here readarray ... code } 2>~/Desktop/ffmpeg.logs Finally, you can notice that FILENAME2 is no longer needed, because I use "${file%.*}.wav" instead (see bash parameter expansion).
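The ${file%.*} expansion these scripts rely on can be seen in isolation:

```shell
# ${file%.*} removes the shortest trailing ".extension", keeping the path:
file="/home/user/videos/clip.mp4"
echo "${file%.*}.wav"   # prints: /home/user/videos/clip.wav
```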
Nautilus Script for multiple files (ffmpeg)
1,422,617,062,000
I have a huge number of images from which I'd like to feed some to ffmpeg from time to time. But I only want to feed the ones that are alphabetically after a certain image (the last frame of the previous run, name stored in some file). Can I, for example, find out the order/index number of that one file and then use head/tail with that number? Or is there some magical -pattern_type glob I could use as an ffmpeg -i parameter? The best filtering solution so far seems to be this, but it seems a bit heavy: find . -maxdepth 1 -type f | sort | awk '$0 > "./picture_2022-04-22_13-46-12.jpg"' One alternative would be to put the list into a text file, do the parsing there, and feed the text file to ffmpeg, but I'd like to think there is a simpler way?
After sorting, some sed or awk could be used to match from pattern until the end of the stream. I assume that your final ffmpeg command accepts a list of file arguments. I use a printf instead of ffmpeg below. find . -type f -print0 | sort -z | sed -nz '/pattern/,$p' | xargs -r0 printf '%s\n' GNU-style NUL separation of arguments is used (the -z/-0 flags). The sed command filters the sorted arguments, from pattern (inclusive) to the end of the stream. If you want to get files from pattern not until the end but until a second pattern, the only modification is to the sed; it becomes sed -zn '/pattern1/,/pattern2/p'. A shorter alternative for your case (depth 1), where we don't test for regular files, would be: printf '%s\0' ./*.jpg | sed -nz '/pattern/,$p' | xargs -r0 printf '%s\n' Here the files are already alphabetically sorted after the first step. Also, you can compare against a string that does not necessarily exist in the filenames, excluding or including it, preferably using awk like you already do. For example, get all files named with a date later than 2022-01: printf '%s\0' ./*.jpg | awk 'BEGIN {RS=ORS="\0"} $0 > "./picture_2022-01"' | xargs -r0 printf '%s\n'
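A toy run of the range filter (GNU sed is assumed for -z; tr is only there to make the NUL-separated output readable):

```shell
# Keep everything from the first match to the end of the NUL-separated list.
printf '%s\0' a.jpg b.jpg c.jpg d.jpg |
    sed -nz '/b\.jpg/,$p' |
    tr '\0' '\n'
# prints b.jpg, c.jpg and d.jpg, one per line
```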
Filtering file list to show only files alphabetically after certain file
1,422,617,062,000
I am trying to split a video video.avi (with N being the total number of frames) into contiguous chunks according to a fixed number of frames (say M = 100). Specifically this process should yield: video_0.avi: Frames 0 to M-1 video_1.avi: Frames M to 2M-1 video_2.avi: Frames 2M to 3M-1 ... It is important that each chunk video_*.avi has exactly M frames. Therefore, the last chunk can be discarded if N is not divisible by M. I found related issues (e.g. Split video file into pieces with ffmpeg) but find myself struggling to replace the specification in seconds with the number of frames.
For the first chunk, you should be able to use -vframes M, which emits frames 0 through M-1, and it will be OK (even for -c copy, if I recall). The other chunks are trickier, and I've had no success with -vf 'select=gte(n\,M)': it needs re-encoding anyway, but does strange things with different codecs (e.g. it leaves the preceding duration as a still frame and plays the sound...). You can get the time in seconds by dividing the frame numbers by the FPS, e.g. if your FPS is $fps and M is $M: ffmpeg -i video.avi -ss `echo "scale=6; $M / $fps" | bc` -to `echo "scale=6; (2 * $M) / $fps" | bc` video_01.avi etc. If you have a scripting language to compute these times, it will be easier. Also, note that when re-encoding (i.e. without -c copy), -ss and -to will be precise, see also: https://superuser.com/questions/459313/how-to-cut-at-exact-frames-using-ffmpeg
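The bc calls can be wrapped into a small helper (hypothetical name) that yields both -ss/-to boundaries for a chunk at once:

```shell
# chunk_times CHUNK_INDEX FRAMES_PER_CHUNK FPS
# prints "start end" in seconds for the given 0-based chunk.
chunk_times() {
    awk -v i="$1" -v m="$2" -v fps="$3" \
        'BEGIN { printf "%.6f %.6f", i * m / fps, (i + 1) * m / fps }'
}

# e.g. chunk 1 of 100-frame chunks at 25 fps spans seconds 4 to 8:
# ffmpeg -i video.avi -ss 4.000000 -to 8.000000 video_01.avi
```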
Split video file into chunks using FFMPEG and exact number of frames
1,422,617,062,000
I am still pretty new to Linux and have been trying very hard to get this right. Please help me. I am trying to merge 2 videos (1 from each folder) multiple times in a batch process, automatically 1 set after the next. I am trying to do it with ffmpeg and a for loop, in order to take one file from the top of the list of folder-1 and merge it with one file from the top of the list of folder-2, and then repeat the process all the way down the folder lists until all videos have been paired up with one another. To imagine this, picture 2 folders side by side with files in each; now line up the files from the folder on the left with the folder on the right. I want to merge 2 videos into 1, multiple times. I was going to draw a diagram but I think it's understood, hopefully. Here is my code; I have changed this so many times, but my latest attempt was the following (I ran it in the directory of folder-1, hoping it would read the files and join them 1 to 1 with folder-2, but sadly no luck): for filename in *.mp4; do folder2="/path/to/folder2" xout="/output/file/as/$(date +'%Y-%m-%d_%H:%M:%S').mp4" ffmpeg -f concat -i "${filename%}" -i "$vid2/${filename%}" -acodec copy -vcodec copy "$xout" done Here is another attempt that is giving me the same error, No such file or directory: for filename in *.mp4; do vid1="/path/folder-1/${filename%.*}" vid2="path/folder-2/${filename%.*}" out1="/path/output/$(date +'%Y-%m-%d_%H:%M:%S').mp4" ffmpeg -f concat -i "$vid1" -i "$vid2" -acodec copy -vcodec copy "$out1" done Can anyone please, please, please tell me what it is I am doing wrong? I can't get this right; it has been around 4 hours and I have tried so many things and read up on so many articles regarding for loops, while loops, ffmpeg commands, etc. Thank you so much for your precious time, it is greatly appreciated!
In your first example, ${filename%} doesn't change $filename at all, and as you told ffmpeg to open the .mp4 file with the concat demuxer -f concat, the error message should have been <actual name of $filename>: Invalid data found when processing input, but you received No such file or directory, so I suspect the glob was not working - maybe your working directory wasn't actually folder-1, in which case the full 'file not found' error message from ffmpeg would have been *.mp4: No such file or directory - the glob didn't match any files, so the parameter filename was set to <literal asterisk><dot>mp4. In your second example, the parameter substitution ${filename%.*} is removing .mp4 from the end of the names you are providing to ffmpeg - possibly why you received No such file or directory. In addition, in both your examples the usage of ffmpeg's concat demuxer is incorrect. The concat demuxer requires a text file as its input (or an appropriate shell substitution, as used in the example below with <()). In your example, input files were specified directly, which is what you might do if you wanted to mux all the streams into a single container - the streams would be in "parallel" (e.g. to add subtitle streams, or a secondary audio track). Concatenation will join the files sequentially. If what you want is to concatenate an .mp4 file in folder-1 with another .mp4 file in folder-2, where the filenames match... Here's an example which I have tested, using absolute paths - /tmp/a and /tmp/b are used instead of /path/folder-1 and /path/folder-2, /tmp as /path/output: seconddirectory="/tmp/b" for i in /tmp/a/*.mp4 do if !
[[ -e "$seconddirectory/${i##*/}" ]] then >&2 echo "no matching file in $seconddirectory for $i" continue fi out="${i##*/}" out="/tmp/${out%.*}-$(date +'%Y-%m-%d_%H:%M:%S')" ffmpeg -f concat -safe 0 -i <(printf '%s\n' "file '$i'" "inpoint 0" "file '$seconddirectory/${i##*/}'" "inpoint 0") -c copy "$out.mp4" done This will output all .mp4 files in /tmp/a for which a matching filename exists in /tmp/b to /tmp/*-date.mp4, concatenating the matching files. (Note: don't use only the date for the output, as it will cause conflicting filenames - use something unique to avoid this - in this example the basename of the input file/s is used.) The ${i##*/} substitutions are there to remove the path component from the absolute paths, leaving just the filename component - using absolute paths means that the current working directory won't interfere with * glob matching. If you wanted to join files with non-matching filenames, you would need to work something different out, e.g. to match the first file from each folder, then the second, etc. (in the order that bash glob sorts them): a=(/tmp/a/*.mp4) b=(/tmp/b/*.mp4) a=("${a[@]:0:${#b[@]}}") b=("${b[@]:0:${#a[@]}}") for (( i=0; i<${#a[@]}; i++ )) do out="${a[i]##*/}" out="${out%.*}-${b[i]##*/}" out="/tmp/${out%.*}-$(date +'%Y-%m-%d_%H:%M:%S')" ffmpeg -f concat -safe 0 -i <(printf '%s\n' "file '${a[i]}'" "inpoint 0" "file '${b[i]}'" "inpoint 0") -c copy "$out.mp4" done This uses array variables, in conjunction with globbing, to build lists of .mp4 files in each directory, and will concatenate a pair (one from each list) until no more pairs exist. Instead of the process substitution <(), you could match pairs of input files and write them to a text file in the format required by ffmpeg, then process that file. E.g. for the first example it would look like: seconddirectory="/tmp/b" for i in /tmp/a/*.mp4 do if !
[[ -e "$seconddirectory/${i##*/}" ]] then >&2 echo "no matching file in $seconddirectory for $i" continue fi out="${i##*/}" out="/tmp/${out%.*}-$(date +'%Y-%m-%d_%H:%M:%S')" printf '%s\n' "file '$i'" "inpoint 0" "file '$seconddirectory/${i##*/}'" "inpoint 0" > "$out.ffcat" done for i in /tmp/*.ffcat do ffmpeg -f concat -safe 0 -i "$i" -c copy "${i/%.ffcat/.mp4}" done An alternative to ffmpeg for this job would be mkvmerge (from mkvtoolnix). It offers a way to concatenate files without needing a text file as input. In the first example from above, the entire ffmpeg line could be replaced with: mkvmerge -o "$out.mkv" "$i" + "$seconddirectory/${i##*/}" The resulting output file will be in a .mkv Matroska container, instead of the .mp4 container used in the ffmpeg example above. Putting all this together in a reusable function: function concatenation_example() { local a b c i out mf if type mkvmerge >/dev/null then mf=m elif type ffmpeg >/dev/null then mf=f else >&2 echo "This function won't work without either mkvmerge or ffmpeg installed." return 1 fi if [[ ! -d "$1" || ! -d "$2" || ! -d "$3" ]] then >&2 printf '%s\n' "concatenation_example FIRSTDIR SECONDDIR OUTDIR" "all arguments must be directories" return 1 fi for i in "$1"/*.mp4 do if ! [[ -e "$2/${i##*/}" ]] then >&2 echo "no matching file in $2 for $i" continue fi out="${i##*/}" out="$3/${out%.*}-$(date +'%Y-%m-%d_%H:%M:%S')" case "$mf" in (m) mkvmerge -o "$out.mkv" "$i" + "$2/${i##*/}" ;; (f) ffmpeg -f concat -safe 0 -i <(printf '%s\n' "file '$i'" "inpoint 0" "file '$2/${i##*/}'" "inpoint 0") -c copy "$out.mp4" ;; esac done }
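The two parameter expansions doing the heavy lifting in these loops, in isolation:

```shell
i="/tmp/a/clip.mp4"
name=${i##*/}    # ## removes the longest leading match of "*/" -> clip.mp4
base=${name%.*}  # %  removes the shortest trailing match of ".*" -> clip
echo "$name $base"
# prints: clip.mp4 clip
```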
ffmpeg merge multiple sets of 2 videos in for loop
1,422,617,062,000
I've got a video file with 1.0 FPS (i.e. one frame per second) and would like to re-encode it so that it plays ca. 20x faster. A short 1 FPS sample is here: http://s3.aws.nz/cam-1537668742.mp4 I can play it 20x faster using mplayer like this: mplayer -speed 20 cam-1537668742.mp4 How can I save it as a video file at this speed? I tried ffmpeg's setpts filter; it made the playback faster but stayed at 1 FPS, i.e. the picture still changed only once per second even though more time had elapsed in the meantime. Any idea? Either using mplayer, ffmpeg or some other Linux tool? Thanks!
Basic template is ffmpeg -i in -vf "setpts=(PTS-STARTPTS)/20,fps=20" out or ffmpeg -i in -vf "setpts=(PTS-STARTPTS)/20" -r 20 out Without the fps filter or -r option, ffmpeg will assume the framerate of the output stream is still 1 fps and so will drop 19 out of each 20 retimed frames.
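As plain arithmetic, the retiming in setpts=(PTS-STARTPTS)/20 maps a source timestamp t to (t - start)/20; a throwaway helper (name invented here) makes that concrete:

```shell
# retime SOURCE_PTS STREAM_START SPEED_FACTOR
# prints where the frame lands on the output timeline, in seconds.
retime() {
    awk -v pts="$1" -v start="$2" -v speed="$3" \
        'BEGIN { printf "%.3f", (pts - start) / speed }'
}

# a frame at 40 s in the 1 fps source plays at 2 s in the 20x output
retime 40 0 20
```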
How to speed up video to make timelapse?
1,422,617,062,000
With the following command I'm trying to capture 10 fps and send them to the device driver. I want 8-bit gray raw frames (640x480 = 307200 bytes per frame) to be sent to the device driver, but I can't figure out how to set the output with ffmpeg or the input with v4l2 to this format. ffmpeg -f v4l2 -r 25 -s 640x480 -i /dev/video0 -f rawvideo "/dev/devicedriver" ffmpeg doesn't seem to support 8-bit gray output. And with v4l2 I can't figure out how to set it; it seems it doesn't recognize V4L2_PIX_FMT_GREY. v4l2-ctl --set-fmt-video-out=width=640,height=480,pixelformat=V4L2_PIX_FMT_GREY I came up with a solution combining Python, OpenCV and ffmpeg: import cv import sys import os import time camera_index = 0 capture = cv.CaptureFromCAM(camera_index) count = 0 def repeat(): global capture global camera_index global count frame = cv.GetMat(cv.QueryFrame(capture)) framegray = cv.CreateMat(480, 640, cv.CV_8UC1) cv.CvtColor(frame, framegray, cv.CV_BGR2GRAY) sys.stdout.write(framegray.tostring()) c = cv.WaitKey(1) if c == 27: print count sys.exit() while True: repeat() and then pipe it to ffmpeg: python capturewebcam.py | ffmpeg -f rawvideo -pix_fmt gray -s 640x480 -i - -an -f rawvideo -r 10 "/dev/null" But I think it really has to be possible with just ffmpeg and v4l2, and I can't figure out how. My head hurts from reading the documentation :p.
First, check which pixel formats are supported by your output device driver: v4l2-ctl --list-formats -d /dev/devicedriver The pixelformat you want to pass on the v4l2-ctl command line is the fourcc shown in the result, e.g.: Pixel Format : 'YUYV' In this case your command line would be: v4l2-ctl --set-fmt-video-out=width=640,height=480,pixelformat=YUYV If you need V4L2_PIX_FMT_GREY, the fourcc will probably be 'GREY' (my guess from videodev2.h; I can't check). If it is not in the result of the list-formats command, it is not supported by the driver, so you'll need some conversion from the source (input/camera) format to the output.
ffmpeg webcam to device driver , output 8 bit grayscale
1,422,617,062,000
When I get a flash video from YouTube, why is the quality of the audio much worse than the original video on YouTube? After downloading the flash movie, I convert it to avi like this: ffmpeg -i ~/"$2.flv" -sameq -acodec libmp3lame -vol 200 -ar 44100 -aq 300 -ab 2097152 ~/"$2.avi" I already set -aq (audio quality) to 300, but there is no difference from 100 or 200. Moreover, 100 is the maximum value in my opinion. -ar (frequency) 44100 should be OK too, and the bitrate in bit/s (-ab) should be 256 kb/s (2097152 / 1024 / 8). I am not sure what the right bitrate for good quality is, but I think 256 kb/s should be fine. Or did I calculate it wrong? What could be the problem?
This is the command line you want: ffmpeg -i ~/test.flv -acodec libmp3lame -qscale 8 test.avi Using the video you suggested as an example, I get almost the same quality in VLC as the original (the original has AAC encoding). You were specifying way too high a bitrate (2 Mb/s; 192 kb/s is more than enough); I don't think it had any side effect on your command line, though. The difference is made by -qscale 8, which lets ffmpeg output a VBR mp3 instead of a CBR stream.
ffmpeg and libmp3lame produces bad audio quality?
1,422,617,062,000
I am trying to install OpenCV on my CentOS 6 and always get this error message: fatal error: sys/videoio.h: No such file or directory #include <sys/videoio.h> Does anybody understand what sys/videoio.h is? Where do I get a file like this one?
Here is a workaround: deselect WITH_V4L and select WITH_LIBV4L in cmake-gui, which stops the check for sys/videoio.h. Again, as with the other solutions I posted, I do not know why it works this way.
what is "sys/videoio.h"?
1,479,208,613,000
I'm working on a live stream transcoder application using nginx + ffmpeg. Everything works fine when I use avconv to transcode, but if I use ffmpeg, I get this error: [tcp @ 0xb4e9da0] Failed to resolve hostname fso.dca.XXXX.edgecastcdn.net: System error Any hints? Seems like an application specific firewall.
The problem is with the static build, as @slm mentioned. I've compiled ffmpeg from source and things work fine now.
Application specific DNS problem?
1,479,208,613,000
I'm using ffmpeg's x11grab to do some screencasting. It works pretty well except on 3D stuff. In particular it seems like 3D draw areas flicker in and out. You can see an example of it here. The issue is present even when I capture only the screen (i.e., not adding in all the other fancy stuff and the webcam capture). I've done a lot of googling on this issue and have found people with a similar issue, but no solution. Many suggest that it is due to OpenGL rendering directly to the hardware and bypassing X11 entirely. Does anyone know of a way to deal with this? If it matters I'm using an nVidia graphics card.
I finally resolved it! The problem was to do with OpenGL as I suspected. To solve the issue, I downloaded VirtualGL. Specifically I grabbed the .deb file from here and installed it with dpkg. Running my applications with vglrun application and then starting the screencast now works perfectly; it even runs more smoothly than it did without vgl.
x11grab flickers in OpenGL draw areas
1,479,208,613,000
I have many folders in one directory, each containing 2 mp3 files. This is how I can find the files: first=$(find ./*/* -type f | sort | awk 'NR % 2 == 1') second=$(find ./*/* -type f | sort | awk 'NR % 2 == 0') I want to concatenate the first file with the second one in each folder: ffmpeg -i "concat:$first|$second" -c copy "both_"$first I have found this stackoverflow answer, but I don't know if and how I can use it for my purpose. Directory structure:

files
|-- folder1
|   |-- 001_001.mp3
|   |-- 001_003.mp3
|-- folder2
    |-- 001_004.mp3
    |-- 001_009.mp3
...

Any help would be appreciated.
I think it would be simpler and more robust to not rely on find in this instance. You have a well defined directory structure and there's really no reason to use find to traverse it since you know exactly where your files are. Instead, use a shell loop: for dirpath in files/*/; do set -- "$dirpath"/*.mp3 ffmpeg -i "concat:$1|$2" -c copy "$dirpath/both_${1##*/}" done or alternatively, for dirpath in files/*/; do ( cd "$dirpath" set -- *.mp3 ffmpeg -i "concat:$1|$2" -c copy "both_$1" ) done This would loop over your folder1, folder2 etc. directories, with the pathname of each directory in $dirpath. For each directory pathname, the body of the loop would expand the *.mp3 globbing pattern in that directory and then use the first two matches of that pattern to run the ffmpeg command. No check is done to make sure that we actually get two matches of the pattern. The difference between the two loops above is that the first loop does not change its directory to each of the $dirpath directories, and therefore $1 and $2 will contain the directory path of the directory. This is why I use "$dirpath/both_${1##*/}" to create the path to a file called both_something.mp3 in the directory (the directory path has to be stripped off from the start of $1 to insert the both_ substring). The other loop uses cd to change directory into each directory in the loop. This way, the $1 and $2 strings would be filenames rather than pathnames with an initial directory path. Since the cd happens within a subshell (the (...)), the change of the working directory is only affecting the commands within the parentheses.
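If you want to see what such a loop would do before letting ffmpeg loose on real files, substituting echo for ffmpeg makes a cheap dry run. A sketch with throwaway files in a temp directory (the file names are made up):

```shell
# Dry-run variant of the answer's loop: echo prints the ffmpeg command
# that would run for each folder, instead of executing it.
base=$(mktemp -d)
mkdir -p "$base/folder1" "$base/folder2"
touch "$base/folder1/001_001.mp3" "$base/folder1/001_003.mp3"
touch "$base/folder2/001_004.mp3" "$base/folder2/001_009.mp3"

for dirpath in "$base"/*/; do
    set -- "$dirpath"*.mp3
    echo ffmpeg -i "concat:$1|$2" -c copy "${dirpath}both_${1##*/}"
done

rm -rf "$base"
```

Since "$dirpath" already ends in a slash, "$dirpath"*.mp3 expands cleanly; once the printed commands look right, drop the echo.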
Loop through folders and concatenate mp3 files with ffmpeg
1,479,208,613,000
I am trying to convert a GIF to a MP4. I am getting an error while doing that with specific parameters. What am I doing wrong? Any help would be nice. I would like a mp4 video with high quality, size is not a problem. Command and log : ffmpeg -i So_gehts.gif -c:v libvpx -crf 4 -b:v 500K output.mp4 ffmpeg version 2.7.6-0ubuntu0.15.10.1 Copyright (c) 2000-2016 the FFmpeg developers built with gcc 5.2.1 (Ubuntu 5.2.1-22ubuntu2) 20151010 configuration: --prefix=/usr --extra-version=0ubuntu0.15.10.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-frei0r --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-openal --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libxvid --enable-libzvbi --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-libssh --enable-libsoxr --enable-libx264 --enable-libopencv --enable-libx265 libavutil 54. 27.100 / 54. 27.100 libavcodec 56. 41.100 / 56. 41.100 libavformat 56. 36.100 / 56. 36.100 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 16.101 / 5. 16.101 libavresample 2. 1. 0 / 2. 1. 0 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 2.100 / 1. 2.100 libpostproc 53. 3.100 / 53. 3.100 Input #0, gif, from 'So_gehts.gif': Duration: N/A, bitrate: N/A Stream #0:0: Video: gif, bgra, 374x461, 29.25 fps, 100 tbr, 100 tbn, 100 tbc File 'output.mp4' already exists. Overwrite ? 
[y/N] y [libvpx @ 0x16a5940] v1.4.0 [mp4 @ 0x168ec60] Could not find tag for codec vp8 in stream #0, codec not currently supported in container Output #0, mp4, to 'output.mp4': Metadata: encoder : Lavf56.36.100 Stream #0:0: Video: vp8 (libvpx), yuva420p, 374x461, q=-1--1, 500 kb/s, 100 fps, 100 tbn, 100 tbc Metadata: encoder : Lavc56.41.100 libvpx Stream mapping: Stream #0:0 -> #0:0 (gif (native) -> vp8 (libvpx)) Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument Any help would be nice. Thank you.
From your error message... codec not currently supported in container I don't think you can use VP8 with an MP4. Try a different codec or container format? I've provided some examples with links to documentation below. MP4 w/ x264: ffmpeg -i So_gehts.gif -c:v libx264 -crf 4 -b:v 500K output.mp4 MP4 w/ x264 (lossless): ffmpeg -i So_gehts.gif -c:v libx264 -preset veryslow -qp 0 output.mp4 WebM w/ VP8: ffmpeg -i So_gehts.gif -c:v libvpx -crf 4 -b:v 1M output.webm WebM w/ VP9: ffmpeg -i So_gehts.gif -c:v libvpx-vp9 -crf 0 -b:v 0 output.webm
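For completeness, the codec/container pairings used in these examples can be expressed as a tiny lookup; this is just a toy illustrating the pairings above, not an exhaustive compatibility table:

```shell
# Toy lookup: which container accepts which of the encoders used above.
pick_container() {
    case "$1" in
        libx264)           echo mp4  ;;
        libvpx|libvpx-vp9) echo webm ;;
        *)                 echo "unknown encoder" ;;
    esac
}

pick_container libvpx     # webm
pick_container libx264    # mp4
```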
FFmpeg: GIF to MP4 conversion throws codec error
1,479,208,613,000
I've an .m4a audio file and wish to take the front 8 secs away and keep the rest of the file intact, then once this first step is done discard the last 8 seconds of the file and keep the rest of the file intact. So in essence the first 8 seconds will be completely discarded / removed from the file, and the file will start at the 8th second as if the previous 8 seconds never existed. Similarly the last 8 seconds of the file will be discarded. Do this all without re-encoding the file. (EDIT: I have seen other answers on here and elsewhere, but I could not get them to work because those answers require the start and end time of the trimmed part to be given. What I need is for ffmpeg to provide the end timestamp and start timestamp of the .m4a audio file without my having to work this out and feed it into the command.) I have this for trimming the front of the file ffmpeg -t 00:00:08 -acodec copy -i in_file.m4a out_file.m4a but nothing for trimming the end of the file. I can't get what I have to work. (I've seen some other answers here and elsewhere, but nothing that seems to get me there)
This will trim the first 8 seconds from the front of the file without re-encoding, and retain everything except the first 8 seconds in the output file: ffmpeg -ss 8 -i in_file.m4a -c copy out_file.m4a In addition, the line below will trim the last 8 seconds from the end of the audio file. It seems convoluted, but it is the simplest solution I could see that will trim the end of the file: ffmpeg -i in_file.m4a -ss 8 -i in_file.m4a -c copy -map 1:0 -map 0 -shortest -f nut - | ffmpeg -y -f nut -i - -map 0 -map -0:0 -c copy out_file.m4a
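A possible alternative for the end-trim, assuming ffprobe is available: query the total duration, subtract the 16 seconds being dropped, and pass the remainder as -t together with -ss 8. The arithmetic part looks like this (the duration is hard-coded here; in practice it would come from something like ffprobe -v error -show_entries format=duration -of csv=p=0 in_file.m4a):

```shell
# Compute the -t value that keeps everything except 8 s at each end.
duration=300.500   # stand-in for the ffprobe result
keep=$(awk -v d="$duration" 'BEGIN { printf "%.3f\n", d - 16 }')
echo "$keep"       # 284.500
# then, hypothetically: ffmpeg -ss 8 -t "$keep" -i in_file.m4a -c copy out_file.m4a
```

Note that with -c copy the cut points land on packet boundaries, so the result may be off by a fraction of a second.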
How to trim first 8 secs & last 8 secs from the front and then the end of a m4a audio file and keep the rest of the file
1,479,208,613,000
I am running some commands in bash (basically some ffmpeg commands) and using grep and awk to filter the results. The command takes some time and continuously outputs results as it progresses through the video. The grep pipe behaves the same way. But the awk pipe waits until the command is completed and prints everything at once, which is not good. I want it to output as soon as it finds a match. How do I change this command to output results in real time as it progresses? ffmpeg -i freeze.mp4 -vf "freezedetect=n=-60dB:d=2" -map 0:v:0 -f null - 2>&1 | grep freezedetect | awk '{print $4,$5}' | tr -d , | grep lavfi
With standard tools, try ffmpeg -i freeze.mp4 -vf "freezedetect=n=-60dB:d=2" -map 0:v:0 -f null - 2>&1 | stdbuf -o 0 grep freezedetect | stdbuf -o 0 awk '{print $4,$5}' | stdbuf -o 0 tr -d , | stdbuf -o 0 grep lavfi
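An alternative to wrapping every stage in stdbuf is to flush from inside awk after each record with fflush(). The text-processing part can be tried on made-up sample lines:

```shell
# awk flushes after every record, so the downstream stages see each
# line immediately instead of after a block buffer fills up.
printf 'a b freezedetect lavfi x\nq r freezedetect lavfi y\n' |
    awk '{ print $4, $5; fflush() }' | tr -d , | grep lavfi
```

GNU grep also has a --line-buffered option for the same purpose on its side of the pipe.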
Pipe and filter bash outputs in realtime
1,479,208,613,000
I'm using ffmpeg combined with tee and mplayer to have a simple video livestream and recorder over SSH. Now, I'd love to embed the current (server) time in the stream. The format doesn't matter much; ideally it would be YYYY-MM-DD HH:MM:SS. I've found this how-to suggesting the following command: ffmpeg -f video4linux2 -input_format mjpeg -s 1280x720 -i /dev/video0 \ -vf "drawtext=fontfile=/usr/share/fonts/dejavu/DejaVuSans-Bold.ttf: \ text='%{localtime\:%T}': [email protected]: x=7: y=700" -vcodec libx264 \ -preset veryfast -f mp4 -pix_fmt yuv420p -y output.mp4 But using the -vf portion of this results in the following error for me: [NULL @ 0x55e4ef96b2a0] Unable to find a suitable output format for 'text=%{localtime:%T}:' text=%{localtime:%T}:: Invalid argument I've also tried this command (the third code block in the Q), just to test it, but it failed as well: [Parsed_drawtext_0 @ 0x55fb2862a940] Both text and text file provided. Please provide only one [AVFilterGraph @ 0x55fb28629fc0] Error initializing filter 'drawtext' with args 'text=%{pts:hms:1553359336.166336638}' Finally, I also found this question, and tried it, slightly modified like this (this is showing my full command; I've only adjusted the -vf options in the past two examples): ffmpeg -r 20 -s 640x480 -f video4linux2 -i /dev/video0 -vf "drawtext=fontsize=90:fontcolor=white: fontfile=/usr/share/fonts/TTF/DejaVuSans.ttf: text='%{localtime\:%X}'" -f avi - This presents me with the following error: [NULL @ 0x55ae61d051a0] Unable to find a suitable output format for 'fontfile=/usr/share/fonts/TTF/DejaVuSans.ttf:' fontfile=/usr/share/fonts/TTF/DejaVuSans.ttf:: Invalid argument Modifying what I've found here a bit to: -vf drawtext="fontfile='/usr/share/fonts/cantarell/Cantarell-Light.otf':fontsize=14:fontcolor=white:shadowcolor=black:shadowx=2:shadowy=1:text='%H-%M-%S':x=8:y=8" Presents me with: [Parsed_drawtext_0 @ 0x55e36db2aa40] Stray % near 'H-%M-%S' Note that using this command with just text (e.g. text='test') works fine. Adding 1-4 \s in front of the %s shows no change in error or effect. My ffmpeg version: ffmpeg version 3.2.12-1~deb9u1 on a Debian machine. It has been compiled with the drawtext library and it is present in ffmpeg -filters.
Okay, I've found the solution in the FFmpeg filter documentation. 10.52.2 Text expansion If expansion is set to strftime, the filter recognizes strftime() sequences in the provided text and expands them accordingly. Check the documentation of strftime(). This feature is deprecated. Though it says "This feature is deprecated.", this works just fine for me. Here is my final -vf: -vf drawtext="expansion=strftime:fontfile='/usr/share/fonts/cantarell/Cantarell-Light.otf':fontsize=14:fontcolor=white:shadowcolor=black:shadowx=2:shadowy=1:text='%Y-%m-%d\ %H\\\\:%M\\\\:%S':x=8:y=8" Please note that, for whatever reason, the four \\\\ are necessary to escape :. To escape the space, a single \ is sufficient. The result is a timestamp overlay in the corner of the video.
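If the text= value is built in a script, the colon escaping can be generated instead of hand-typed. A small helper sketch; note that depending on your shell quoting layers the backslashes may need to be doubled again, as the four-backslash form above shows:

```shell
# Escape ':' once for the filtergraph parser (one backslash per colon).
esc_colons() { printf '%s\n' "$1" | sed 's/:/\\:/g'; }
esc_colons '%H:%M:%S'
```

This prints %H\:%M\:%S; run the result through the same function again if an extra quoting layer eats one level of backslashes.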
How to embed current time with ffmpeg?
1,479,208,613,000
I have a script that generates an rtmp stream using ffmpeg and runs inside a screen session, but it fails for some reason. If I run it directly on the command line it works, so I tried running it by opening screen, starting it there, and detaching with ctrl + d, but even so it sometimes dies. Is there a way to log the stderr from ffmpeg inside the script to see why the command fails inside the script but works when run directly?

#!/bin/bash
if [ "$1" = "run" ]; then
    source $HOME/.bash_aliases
    while [ 1 ]; do
        ffmpeg -f x11grab -s 1000x563 -framerate 30 -i $DISPLAY+10,151 -f pulse -ac 2 -i default -c:v libx264 -preset ultrafast -tune zerolatency -x264opts keyint=30:min-keyint=10 -b:v 2000k -pix_fmt yuv420p -s 1000x562 -c:a aac -b:a 160k -ar 44100 -t '4:00:00' -threads 0 -f flv rtmp://IPADDRESS:1935/live1/$CHANNEL
    done
else
    killall -9 ffmpeg > /dev/null 2>&1
    sleep 5s
    screen -ls | grep ffmpeg > /dev/null 2>&1
    if [ "$?" = "1" ]; then
        screen -mdS ffmpeg $0 run
    fi
fi

Thank you ~
Is there a way to log the stderr from ffmpeg inside the script to see why the command fails inside the script but works when run directly? Here you go, forward stream 2, stderr, to a file in tmp. Oh, and because you do not want it to overwrite the tmp file all the time, add the PID to its name: my_command 2> /tmp/ffmpeg_error$$.log Now, applied to your ffmpeg: ffmpeg -f x11grab -s 1000x563 -framerate 30 -i $DISPLAY+10,151 -f pulse -ac 2 -i default -c:v libx264 -preset ultrafast -tune zerolatency -x264opts keyint=30:min-keyint=10 -b:v 2000k -pix_fmt yuv420p -s 1000x562 -c:a aac -b:a 160k -ar 44100 -t '4:00:00' -threads 0 -f flv rtmp://IPADDRESS:1935/live1/$CHANNEL 2> /tmp/ffmpeg_error$$.log
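The $$ in the log name gives each run its own file. The redirection mechanism itself can be seen with plain shell commands standing in for ffmpeg:

```shell
# Redirect only stderr of a command group to a per-PID log file.
log="/tmp/demo_stderr_$$.log"
{ echo "to stdout"; echo "to stderr" >&2; } 2> "$log"
cat "$log"
rm -f "$log"
```

Here "to stdout" appears directly, while "to stderr" only shows up when the log file is read back, which is exactly what happens to ffmpeg's diagnostics with the 2> redirection above.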
ffmpeg command fails silently inside script but works when run directely
1,479,208,613,000
I have videos with low framerates (1 to 3), and I want to make a 30 fps video from them. I'm searching for the proper video filter. This is not motion interpolation, but a corresponding pixel-to-pixel interpolation of two neighboring frames. Thus I want to receive 29 frames which will provide a smooth transition between frames with ffmpeg.
The filter I need is called "framerate". This is very simple and light. With it I can easily make slideshows. ffmpeg -i in.mp4 -vf framerate='fps=60:interp_start=1:interp_end=254' out.mp4
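The 29 intermediate frames mentioned in the question fall straight out of the fps ratio:

```shell
# Frames the framerate filter must synthesize between two source frames.
in_fps=1
out_fps=30
echo $(( out_fps / in_fps - 1 ))    # 29
```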
ffmpeg smooth transition between frames
1,479,208,613,000
I have an app that I built that simply plays an icecast feed from the internet and if the feed is gone, it plays a backup feed or some local audio files. I need a way to show a representation of the playing audio on a dashboard that runs on a nginx server on the same machine. I would like it to be realtime if possible but just showing audio moving is good enough. Also I would use this to do some scripting in case the audio is silent from the stream. Thanks!
The simplest (though definitely not cheapest, CPU-wise) way to do this that I can come up with is to have ffmpeg output an image with a loudness meter every once in a while, in addition to its normal output. You can do it something like this: ffmpeg -i «INPUT» \ -filter_complex '[0:a]ebur128=video=1:meter=18:metadata=0[g][j], [j]anullsink, [g]fps=1[g]' \ -map '[g]' -f image2 -update 1 -y «IMAGEFILE».png \ -map '0:a' -c:a copy -y «AUDIO_OUTPUT» That should output an image once per second (though note that's per second of processed audio; ffmpeg will, as usual, run as fast as the CPU allows, at least if the output will accept data that fast. I'm presuming your output limits it to running at 1x). You can change how often the image updates by changing the fps= value; 2 would mean twice per second and 0.5 would mean every 2s. Obviously, if ffmpeg exits, the image will just stop being updated. Similarly if it is stopped, e.g., because the output will not take more data, or if the input has none. The modification time on the image will make it obvious enough that this has happened. You could also have ffmpeg put a timestamp on top of the image with the current time using the drawtext filter: ⋮ -filter_complex '[0:a]ebur128=video=1:meter=18:metadata=0[g][j], [j]anullsink, [g]fps=1, drawtext=text=%{localtime} %{pts}:x=60:y=460:fontcolor=Cyan[g]' ⋮ Then you don't even need PHP — you just need to serve a static image. PS: As for scripting if it's silent, that'd really be best done in your app — and FYI you can have the same ebur128 filter output metadata in various formats (e.g., JSON) so you can check if it's gone silent. Unfortunately it gets mixed in with other ffmpeg output, so parsing can be a bit annoying. I've written Perl code to do it, but you'll probably find it easy enough to do in the scripting language of your choice.
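For the silence-scripting part, the loudness lines can be picked apart with standard tools. A sketch; the sample line below is made up to resemble the ebur128 filter's progress output, so verify the field layout against what your ffmpeg actually prints:

```shell
# Extract the momentary loudness (M:) from an ebur128-style status line
# and decide whether the stream has gone silent.
line='[Parsed_ebur128_0 @ 0x55d] t: 2.1  M: -70.0 S: -69.8  I: -50.2 LUFS  LRA: 0.0 LU'
m=$(printf '%s\n' "$line" | sed -n 's/.* M: *\(-\{0,1\}[0-9.]*\).*/\1/p')
awk -v m="$m" 'BEGIN { print (m < -60 ? "silent" : "audio present") }'
```

The -60 threshold is an arbitrary value for illustration; tune it to whatever your backup-feed logic should treat as silence.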
Using Alsa, How can I get the current levels of audio playing through ffmpeg or mpg123 to display on a web dashboard?
1,479,208,613,000
I have tried to use ffmpeg to download youtube media as mp3. ffmpeg -i <url> -f mp3 output.mp3 It's working with other urls, but not with youtube-dl retrieved youtube video urls. Ffmpeg returns error 403, forbidden. I also can't download with wget, but from the browser and vnc player the url is working. I want to download separately with ffmpeg, because I don't want to download in the original format. What's the problem? How can I fix it? The commands: ./youtube-dl -f bestaudio -g https://m.youtube.com/watch?v=D-dONCnY_Yg ffmpeg -i https://r1---sn-qxo7rn7e.googlevideo.com/videoplayback?signature=021CAFB9066554DD33675D89CC80D6E5FC616A7E.8A6222115FF91416C7F1B639B8F4A86671B40DD2&ipbits=0&sparams=clen%2Cdur%2Cei%2Cgir%2Cid%2Cip%2Cipbits%2Citag%2Ckeepalive%2Clmt%2Cmime%2Cmm%2Cmn%2Cms%2Cmv%2Cpl%2Crequiressl%2Csource%2Cexpire&fvip=1&source=youtube&id=o-AALFu428zi6lOqHvA5xGfADpvNCR_BXItpMzqWb73CSH&mm=31%2C26&expire=1520111723&dur=293.721&lmt=1508989837160273&key=yt6&ip=35.227.125.114&ms=au%2Conr&ei=C7yaWuSpMYj5qQWY_qH4DQ&mv=m&mt=1520090001&requiressl=yes&gir=yes&mn=sn-qxo7rn7e%2Csn-cvb7ln7l&clen=4618202&keepalive=yes&c=WEB&mime=audio%2Fwebm&pl=24&itag=251&ratebypass=yes -f mp3 output.mp3 And ffmpeg returns error 403 forbidden.
You forgot to quote the URL given to ffmpeg so the shell's consuming some of the characters as expressions or something else. ffmpeg -i "https://r1---sn-qxo7rn7e.googlevideo.com/videoplayback?signature=021CAFB9066554DD33675D89CC80D6E5FC616A7E.8A6222115FF91416C7F1B639B8F4A86671B40DD2&ipbits=0&sparams=clen%2Cdur%2Cei%2Cgir%2Cid%2Cip%2Cipbits%2Citag%2Ckeepalive%2Clmt%2Cmime%2Cmm%2Cmn%2Cms%2Cmv%2Cpl%2Crequiressl%2Csource%2Cexpire&fvip=1&source=youtube&id=o-AALFu428zi6lOqHvA5xGfADpvNCR_BXItpMzqWb73CSH&mm=31%2C26&expire=1520111723&dur=293.721&lmt=1508989837160273&key=yt6&ip=35.227.125.114&ms=au%2Conr&ei=C7yaWuSpMYj5qQWY_qH4DQ&mv=m&mt=1520090001&requiressl=yes&gir=yes&mn=sn-qxo7rn7e%2Csn-cvb7ln7l&clen=4618202&keepalive=yes&c=WEB&mime=audio%2Fwebm&pl=24&itag=251&ratebypass=yes" output.mp3 Of course, this link has probably expired and you'll need to get a fresh one from youtube-dl.
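The same quoting requirement applies if you splice youtube-dl's output straight into the ffmpeg call via command substitution. A toy illustration, where get_url is a stand-in for youtube-dl -g and the URL is made up:

```shell
# Quoted command substitution keeps the '&'-laden URL in one piece.
get_url() { echo 'https://example.com/videoplayback?sig=abc&expire=123&itag=251'; }
url="$(get_url)"
printf '%s\n' "$url"
# hypothetically: ffmpeg -i "$url" output.mp3   (the quotes are essential)
```

Unquoted, the shell would split the URL at each & and try to background the fragments as separate commands.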
Ffmpeg - youtube-dl
1,479,208,613,000
While trying to solve an issue about loading MPEG videos in Matlab, I found several suggestions to install the FFmpeg plugin for Gstreamer. However, I can't find this functionality on Debian 9 [1]. It was available on Debian 7, though [2]. This is not really an XY question. I'm still looking for alternatives for my issue with Matlab, but I am also interested in knowing what changed in Gstreamer or if I'm assuming anything wrong. Is there another package that supplies the functionality I was expecting from gstreamer0.10-ffmpeg? [1] - https://packages.debian.org/search?suite=default&section=all&arch=any&searchon=names&keywords=gstreamer1.0-ffmpeg [2] - https://packages.debian.org/search?suite=default&section=all&arch=any&searchon=names&keywords=gstreamer0.10-ffmpeg
It was renamed to gstreamer-libav; on Debian the package is gstreamer1.0-libav.
What happened to gstreamer-ffmpeg and can I replace it?
1,479,208,613,000
I've already installed FFmpeg according to the ffmpeg Ubuntu compile guide. I can't use AAC audio encoding or libx264, which I need. How do I install FFmpeg so that all the options below are enabled in the installation? Do I need to uninstall FFmpeg and start over again, or can I just add to what has already been installed?
I do this on a regular basis since I like to use ffmpeg's bleeding edge features now and then. For libfdk-aac and libx264, you want to install the respective development packages: sudo apt install libfdk-aac-dev libx264-dev Then I configure ffmpeg like this: ./configure --prefix=/opt/ffmpeg-$(cat RELEASE) --enable-gpl --enable-nonfree --enable-version3 --enable-libx264 --enable-libfdk-aac --enable-pthreads --enable-postproc --enable-gnutls --disable-librtmp --disable-libopencv --disable-libopenjpeg --enable-libpulse --arch=amd64 --disable-shared --enable-static --disable-doc --extra-cflags=--static --extra-libs="-ldl" --disable-outdev=alsa --disable-outdev=oss --disable-outdev=v4l2 --disable-outdev=sndio --disable-indev=alsa --disable-indev=oss --disable-indev=sndio --disable-indev=jack As you see, I explicitly enable libx264 and fdk-aac and I disable a lot of features I don't need. Your mileage may vary, of course. The fancy part is --prefix=/opt/ffmpeg-$(cat RELEASE) --disable-shared --enable-static --disable-doc --extra-cflags=--static --extra-libs="-ldl", which gives you a static compile. make install will put it into opt, so it does not conflict with the ffmpeg version provided by the package manager. I do not actually execute make install, though. Instead I use sudo checkinstall --pkgname="ffmpeg-$(cat RELEASE)" --pkgversion="$(cat RELEASE)~git$(git rev-parse --short HEAD)" --backup=no --deldoc=yes --fstrans=no --default to produce a Debian package I can uninstall without hassle. In case this information becomes obsolete, I probably update the corresponding post in my personal blog.
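The version string handed to checkinstall is plain string assembly; with stand-in values for $(cat RELEASE) and the short git hash it comes out like this:

```shell
# Assemble the checkinstall version string from release and commit id.
release="4.4"      # stand-in for $(cat RELEASE)
commit="abc1234"   # stand-in for $(git rev-parse --short HEAD)
pkgversion="${release}~git${commit}"
echo "$pkgversion"    # 4.4~gitabc1234
```

The ~ in Debian version numbers sorts before anything, so a later plain release number will still supersede the git snapshot package.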
How to add aac and libx264 to FFmpeg installation?
1,479,208,613,000
I had a static installation of ffmpeg but was running into trouble with config files not found so I have since just deleted it. Then I tried re-installing a more current static version just to see if it differs from first one I installed a week ago and it does. The first one had the ffmpeg script which I ended up placing in /usr/bin and a ffmpeg-static folder with cache in root. The latest static I just installed has no static folder and just two scripts, the ffmpeg and ffprobe. I have a few questions: Is it best to place these scripts in usr/bin, does it matter? Does yum update static versions? Is it much better practice to compile ffmpeg than use static versions? Has anyone tried this type of ffmpeg installation as stated in the following link?
Q#1: Is it best to place these scripts in usr/bin, does it matter? Does yum update static versions? No; the static versions that the FFmpeg project provides on their website will not be managed by yum if you opt to install them into /usr/bin. I would probably not opt to install these to /usr/bin. I'd rather install them into a user's $HOME/bin directory instead, to keep them separate from packaged versions which may reside in /usr/bin. Q#2: Is it much better practice to compile ffmpeg than use static versions? I've been using the static builds of late on CentOS and Fedora; they simply work and are easy to install and move around. They include all the FFmpeg features and are the easiest by far. Compiling, though easy enough, can present you with issues, and there isn't really that much upside if you can find a pre-packaged version from a YUM repository. Q#3: Has anyone tried this type of ffmpeg installation as stated in the following link? Yes, these types of installations are the preferred way to install packages from YUM repositories, when wanting to install software system-wide on a box.
ffmpeg installation on Linux RHEL/CentOS 6.x
1,479,208,613,000
Running Ubuntu 13.10 with a fully compiled ffmpeg. I know the code for the actual conversion is ffmpeg -i video.mp4 -codec copy video.avi I just need a plain and simple Bash script to do that for, say, forty or fifty of the .mp4 files.
If you have a list of files you can use something like: while IFS= read -r file; do ffmpeg -nostdin -i "$file" -codec copy "${file%.mp4}.avi"; done < list-of-files.txt or simply: cd /path/; for file in *.mp4; do ffmpeg -nostdin -i "$file" -codec copy "${file%.mp4}.avi"; done (Quoting "$file" keeps filenames with spaces intact, and -nostdin stops ffmpeg from consuming the file list being read on standard input.)
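To check that such a loop survives awkward filenames before running real conversions, echo can stand in for ffmpeg (the files here are empty throwaways):

```shell
# Dry run: print the command for each .mp4, including one with a space.
dir=$(mktemp -d)
touch "$dir/a clip.mp4" "$dir/b.mp4"
for file in "$dir"/*.mp4; do
    echo ffmpeg -i "$file" -codec copy "${file%.mp4}.avi"
done
rm -rf "$dir"
```

The ${file%.mp4} expansion strips the suffix so the .avi name lands next to the source file.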
Batch Convert .mp4 to .avi with ffmpeg
1,479,208,613,000
I have a number of video files (500+) with lots of audio and subtitle streams for languages that I don't need and would thus like to remove to conserve storage space. I tinkered around with ffmpeg, but removing streams by processing one file after another turned out to be very time consuming. I had no luck with scripting either, as the video files contain different streams in different orders, which makes removal via index difficult and error prone. There must be a solution that is both faster and also works for files containing different streams, right? Any help would be much appreciated.
You could use the following ffmpeg command line: ffmpeg -i video.mkv -map 0:v -map 0:m:language:eng -codec copy video_2.mkv Explanation:

-i video.mkv             input file (identified as '0:' in mappings below)
-map 0:v                 map video streams from input to output file
-map 0:m:language:eng    map streams for language 'eng' from input to output file
                         (may be specified multiple times for multiple languages)
-codec copy              copy streams without reencoding
video_2.mkv              output file

The resulting file will retain streams for English only (except for video streams, which are copied regardless). I actually even created a script for this a while ago (GitHub Gist: remove-unneeded-languages.sh):

#!/usr/bin/env bash

# -------------------------------------------------------------------------
#  Remove unneeded language(s) from video file
#
#  Created by Fonic <https://github.com/fonic>
#  Date: 04/14/22 - 08/26/22
#
#  Based on:
#  https://www.reddit.com/r/ffmpeg/comments/r3dccd/how_to_use_ffmpeg_to_detect_and_delete_all_non/
# -------------------------------------------------------------------------

# Print normal/hilite/good/warn/error message [$*: message]
function printn() { echo -e "$*"; }
function printh() { echo -e "\e[1m$*\e[0m"; }
function printg() { echo -e "\e[1;32m$*\e[0m"; }
function printw() { echo -e "\e[1;33m$*\e[0m" >&2; }
function printe() { echo -e "\e[1;31m$*\e[0m" >&2; }

# Set up error handling
set -ue; trap "printe \"Error: an unhandled error occurred on line \${LINENO}, aborting.\"; exit 1" ERR

# Process command line
if (( $# != 3 )); then
    printn "\e[1mUsage:\e[0m ${0##*/} LANGUAGES INFILE OUTFILE"
    printn "\e[1mExample:\e[0m ${0##*/} eng,spa video.mkv video_2.mkv"
    printn "\e[1mExample:\e[0m for file in *.mkv; do ${0##*/} eng,spa \"\${file}\" \"out/\${file}\"; done"
    printn "\e[1mNote:\e[0m LANGUAGES specifies language(s) to KEEP (comma-separated list)"
    exit 2
fi
IFS="," read -a langs -r <<< "$1"
infile="$2"
outfile="$3"

# Sanity checks
[[ -f "${infile}" ]] || { printe "Error: input file '${infile}' does not exist, aborting."; exit 1; }
command -v "ffmpeg" >/dev/null || { printe "Error: required command 'ffmpeg' is not available, aborting."; exit 1; }

# Run ffmpeg
printh "Processing file '${infile}'..."
lang_maps=(); for lang in "${langs[@]}"; do lang_maps+=("-map" "0:m:language:${lang}"); done
ffmpeg -i "${infile}" -map 0:v "${lang_maps[@]}" -codec copy -loglevel warning "${outfile}" || exit $?
exit 0

Use it like this (e.g. to process all .mkv files and only keep streams for English and Spanish): mkdir out; for file in *.mkv; do remove-unneeded-languages.sh eng,spa "${file}" "out/${file}"; done
How to remove unneeded languages from video files using ffmpeg?
1,479,208,613,000
On Linux I have a process (ffmpeg) that writes very slowly (even slower than 1 kB/s sometimes) to disk. Ffmpeg can buffer this into 256 kB chunks that get written infrequently, but ffmpeg hangs occasionally, and if I try to detect these hangs by checking that the file is being updated, I need to wait a long time between updates, up to 10 or 15 minutes; otherwise I can sometimes mistakenly kill the ffmpeg process when it appears to have stopped writing when in fact it's still filling its internal buffer. There's no way to detect this, it seems, unless I use strace (that I can find, anyway). So I am wondering about turning off buffering in ffmpeg and writing unbuffered to disk from ffmpeg. This will result in the disk constantly making tiny writes and wasting power (and probably, if I use an SSD, mess with wear levelling too). So I would like to make ffmpeg write to a 'virtual file' (in memory - either kernel memory or a process) whose flushing characteristics I can specify. The idea being to perhaps specify a flush every 2 minutes; then I can keep an eye on the file size and make sure it's still being written. I don't think I've missed any other ways to do this job - even if I could watch the socket stream incoming to ffmpeg, the process itself could still stop writing and lose data. Doing the buffering outside of ffmpeg seems like the best way. Is there a built-in way to do this in Linux, or does it mean a custom process? I guess I know how to do this with a small C program and pipe the data in, but I wonder if there's a neater way.
You can use GNU dd for this; it was designed to reblock data when reading/writing tape drives optimally. Pipe the unbuffered output into, for example, dd obs=20k status=progress >/the/file where 20k is the output block size you wish to use for each write to disk. k means kibibytes. With status=progress you get a line on stderr which updates when a write is done, e.g. 20480 bytes (20 kB, 20 KiB) copied, 12 s, 1.7 kB/s If you prefer, omit this option and kill dd with signal SIGUSR1 when you want it to write out the progress.
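The reblocking effect is easy to observe with coreutils alone; here we just confirm that the byte count survives the reblock (the input side uses head -c as a stand-in for a trickling producer):

```shell
# 100 bytes trickle into dd; obs=100 makes dd write them out as a
# single 100-byte output block, and the byte count is unchanged.
head -c 100 /dev/zero | dd obs=100 2>/dev/null | wc -c
```

On GNU/Linux this prints 100; with a real slow producer you would see one large write instead of many small ones, which is the point of the exercise.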
Create a file for writing with controlled flushing to disk in large chunks
1,479,208,613,000
I have a large music collection. Some of it is lossless files, and some is lossy. I would like to maintain a copy of the collection that consists of the original collection's lossy files, and lossy transcodes of the original collections lossless files. Some assumptions: I know how to use ffmpeg to convert flac to opus files I only have flac files that need to be converted, no wav or alac codecs The lossy files can be opus, vorbis, or mp3 I want to: Use minimal storage for the the new music collection. I.e. it would link back to the original lossy files where appropriate. Keep the collection up to date as I add more lossy and lossless files to the original, or update the metadata. Not have to re-transcode lossless files that have not been modified. I imagine I will need to use some custom scripting to accomplish this, but if anyone has recommendations or tips before I sink lots of time into this, I would be eternally grateful.
I don't like Makefiles (I might concur with this guy); however, make does what you want, out of the box: you define a rule, for example, that you want an .opus file for every source .flac file. Makefile, from the top of my head:

TARGETDIR=/path/to/compressed/library

%.opus: %.flac
	ffmpeg -ffmpegflags -and -stuff -i "$<" "$@"

$(TARGETDIR)/%.opus: %.opus
	cp --reflink=always "$<" "$@"

That could convert all your FLACs into OPUSes in-tree. And it will only do that if the .opus file isn't there yet, or older than the last change in the FLAC. I don't like it, because it happens in-tree, i.e. you don't end up with a clean "originals only" directory. At least use a cp that supports reflinks, on a file system that does, too, so that your copies are shallow (and don't actually need any space). It also doesn't deal with subdirectories gracefully, I think, you'll find. Then, honestly, make's functionality here is really just: for each wildcard source file (%.flac), check whether the result (same file, .opus) has been built already, and if not (or the build is older than the source file), do the build. That's a bit backwards, and also not complicated enough to depend on Make. So, shell scripting. I use zsh. And while I don't test what I write, I try to comment it:

#!/usr/bin/zsh
# Copyright 2022 Marcus Müller
# SPDX-License-Identifier: BSD-3-Clause
# Find the license text under https://spdx.org/licenses/BSD-3-Clause.html

# set options:
setopt null_glob    # Don't fail if there's no file matching a pattern
setopt no_case_glob # Don't care about case in matching

TARGET_DIR=../compressed_library

make_containing_dir() {
  target_dir="${1:h}"
  if [[ ! -d "${target_dir}" ]] ; then
    logger -p user.debug "Creating directory ${target_dir}"
    mkdir -p "${target_dir}" || logger -p user.err "can't mkdir ${target_dir}"
  fi
}

for compressed_source in **/*.{mp3,opus,vorbis,mp4} ; do
  if [[ -d "${compressed_source}" ]]; then
    continue # skip directories that happen to have a matching suffix
  fi
  logger -p user.debug "dealing with compressed source ${compressed_source}"
  target_file="${TARGET_DIR}/${compressed_source}"
  make_containing_dir "${target_file}"
  # -h : check whether target exists and is a symlink
  if [[ ! -h "${target_file}" ]] ; then
    ln -s "$(pwd)/${compressed_source}" "${target_file}" \
      || logger -p user.err "copying ${compressed_source} failed"
  fi
done

for uncompressed_source in **/*.flac ; do
  if [[ -d "${uncompressed_source}" ]]; then
    continue # skip directories that happen to have a matching suffix
  fi
  logger -p user.debug "dealing with uncompressed source ${uncompressed_source}"
  target_file="${TARGET_DIR}/${uncompressed_source%%.flac}.opus"
  # ^ strip the .flac suffix
  make_containing_dir "${target_file}"
  #    /-- compare source file for "older than" target file;
  #    |   this fails if the source file is newer, or the
  #    |   target file nonexisting, hence the negation:
  #    |   transcode in exactly those cases
  if ! [[ "${uncompressed_source}" -ot "${target_file}" ]]; then
    ffmpeg -loglevel fatal \
      -i "${uncompressed_source}" \
      -b:a 96k \
      "${target_file}" \
      || logger -p user.err "transcoding ${uncompressed_source} failed"
  fi
done

This is very untested, but at least it logs to syslog (journalctl -xef is your friend).
Maintain parallel lossy and lossless music collections
1,479,208,613,000
My node application spawns ffmpeg processes. In htop, there are a bunch of ffmpeg processes I would have expected to have ended but they are still shown in htop. The threads in green are the process that is currently active. The ones in white are shown using memory, and the time column is not incrementing. Are these processes using any resources, and should I be looking into what's causing these processes not to close cleanly?
Yes, they're using resources, though it's hard to say how much; it could be a very small amount. First thing to check for is just with ps: see if their status is Z (zombie), which would mean they've exited, but you're not calling wait/waitpid/etc. on them. (Probably not, as I think Node handles this for you). Otherwise, they've probably got some RAM, some number of file descriptors (likely in both ffmpeg and Node), and of course a process-table entry in use; you will run out of all of those if your app is long-running and keeps leaking them. I suppose worst would be if it still has that HTTP connection open; that will consume resources on dar.fm too. Tools like ps, lsof, and even strace can help check on what resources those ffmpegs are using. Whether you should fix it... is something you'll have to decide based on how much it costs in programmer time to fix vs. extra hardware/operations time to manage.
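As a quick sketch of the ps check described above (the `Z` state flag is what ps reports for a zombie; the `-C ffmpeg` selector and output format are standard procps options, but the exact column spacing may vary):

```shell
# List ffmpeg processes and flag any whose state starts with Z (zombie).
# -o stat=,pid=,args= prints state, PID and command line without headers.
ps -o stat=,pid=,args= -C ffmpeg | awk '$1 ~ /^Z/ { print "zombie:", $2 }'
```

For any PID that is not a zombie but still hanging around, `lsof -p <pid>` will show which file descriptors (including network connections) it is still holding.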
Are these threads in htop using any resources?
1,479,208,613,000
I'm trying to launch a series of FFMPEG commands over SSH. My SSH client is the native openssh (7.7.2.1) client in Windows Server 2019 (ver 1809 build 17763.2114). The command I am trying to run is this: C:\Windows\System32\OpenSSH\ssh.exe -v [email protected] -n -t 'nohup /home/user1/ffmpeg/ffmpeg -f lavfi -i testsrc -f null - -nostdin -nostats -hide_banner -loglevel error &' Other Stack Exchange answers indicate that using "nohup" and the trailing "&" will launch the command and release the console. The problem I'm facing is that the command runs, but the session is not released while the command keeps running. If I append " \" to the end of the line, the session releases, but the command stops. If I force a disconnect, then the ffmpeg command continues running, but I need a smooth way to release the process so my PowerShell script can continue running. I want to keep all these commands on one line. How can I connect, launch this command, disconnect and leave it running?
I found the answer quickly. I had to modify my own command by redirecting the console output. C:\Windows\System32\OpenSSH\ssh.exe -v [email protected] -n "nohup /home/user1/ffmpeg/ffmpeg -f lavfi -i testsrc -f null - -nostdin -nostats -hide_banner -loglevel error > /dev/null 2>&1 & " I will be able to replace /dev/null with another file to write the output to.
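The key ingredients of this fix are the output redirection and the trailing `&`; the pattern can be checked locally without SSH. A minimal sketch (the `sh -c 'echo ...'` stand-in and the `ffmpeg.log` file name are made up for illustration):

```shell
# Launch a long-running command detached from the terminal, logging to a file.
# In the real case, the sh -c '...' part would be the ffmpeg invocation.
nohup sh -c 'echo started; sleep 1; echo done' > ffmpeg.log 2>&1 &
wait            # in a real script you would just carry on instead of waiting
cat ffmpeg.log  # shows the command's output
```

Redirecting stderr with `2>&1` matters here: if either stdout or stderr stays attached to the SSH session, sshd keeps the connection open until the process exits.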
How do I use SSH to launch a nohup ffmpeg command and leave it running after disconnect?
1,479,208,613,000
ffplay can nicely open e.g. /dev/video0 and monitor the incoming video frames (e.g. you can watch TV on a TV card). Giving /dev/video to ffmpeg also makes it easy to encode the video. Is it possible to do both: get video frames onto the screen while also encoding them at the same time?
There are many ways. I usually copy the raw video stream to a ffplay instance with the help of tee:

ffmpeg -hide_banner -loglevel error -f v4l2 -pixel_format yuyv422 -video_size 1280x960 -i /dev/video0 -c:v copy -f rawvideo - |\
tee >(ffplay -f rawvideo -pixel_format yuyv422 -video_size 1280x960 -) |\
ffmpeg -f rawvideo -pixel_format yuyv422 -video_size 1280x960 -i - -c:v libx264 -crf 21 -y /tmp/encoded.mp4

In case you want to see the raw input as well as the output, the example is a bit more convoluted:

ffmpeg -hide_banner -loglevel error -f v4l2 -pixel_format yuyv422 -video_size 640x480 -r 25 -i /dev/video0 -c:v copy -f rawvideo - | \
tee >(ffplay -hide_banner -loglevel error -window_title "Input" -f rawvideo -pixel_format yuyv422 -video_size 640x480 -) | \
ffmpeg -hide_banner -loglevel error -f rawvideo -pixel_format yuyv422 -video_size 640x480 -i - -c:v libx264 -crf 21 -f h264 - | \
tee >(ffplay -hide_banner -loglevel error -window_title "Encoded" -f h264 -) | \
ffmpeg -f h264 -r 25 -i - -y /tmp/encoded.mp4

Be sure to specify matching pixel formats, resolutions and frame-rates.
ffmpeg: monitoring video being encoded from /dev/video* on screen
1,479,208,613,000
I'm trying to use ffmpeg's signature function to perform a duplicate analysis on several thousand video files that are listed in the text file vids.list. I need to have it so that every file is compared with every other file, then that line of the list is removed. The following is what I have so far:

#!/bin/bash
home="/home/user/"
declare -i lineno=0

while IFS="" read -r i; do
    /usr/bin/ffmpeg -hide_banner -nostats -i "${i}" \
        -i "$(while IFS="" read -r f; do
                  echo "${f}"
              done < ${home}/vids.list)" \
        -filter_complex signature=detectmode=full:nb_inputs=2 -f null - < /dev/null
    let ++lineno
    sed -i "1 d" ${home}/vids.list
done < vids.list 2> ${home}/out.log

ffmpeg is outputting a "too many arguments" error because the inner while loop is dumping all the filenames into the second -i. I'm not sure if I need a wait somewhere (or a formatting option) to hold the loop open while the top while loop finishes. Just to clarify, I would need the loop to start at line 1 of the text file with paths, compare that file with the file from line 2, 3, 4...2000 (or whatever), remove line 1, and continue.
Sidestepping the exact command, I take it you want something like this (with the obvious four-line input)?

$ bash looploop.sh
run ffmpeg with arguments 'alpha' and 'beta'
run ffmpeg with arguments 'alpha' and 'charlie'
run ffmpeg with arguments 'alpha' and 'delta'
run ffmpeg with arguments 'beta' and 'charlie'
run ffmpeg with arguments 'beta' and 'delta'
run ffmpeg with arguments 'charlie' and 'delta'

We already know how to make a loop, so let's just add another, nested inside the first. That by itself would match all input lines with themselves and all pairs twice, so count the lines to skip the pairs that will already have been processed.

#!/bin/bash
i=0
while IFS= read a; do
    i=$((i + 1))
    j=0
    while IFS= read b; do
        j=$((j + 1))
        if [ "$j" -le "$i" ]; then continue; fi
        # insert the actual commands here
        printf "run ffmpeg with arguments '%s' and '%s'\n" "$a" "$b"
    done < vids.list
done < vids.list

Or like you did, removing the lines as they are processed by the outer loop; this is actually shorter:

#!/bin/bash
cp vids.list vids.list.tmp
while IFS= read a; do
    while IFS= read b; do
        if [ "$a" = "$b" ]; then continue; fi
        # insert the actual commands here
        printf "run ffmpeg with arguments '%s' and '%s'\n" "$a" "$b"
    done < vids.list.tmp
    sed -i '1d' vids.list.tmp
done < vids.list.tmp
rm vids.list.tmp

I'm not sure what exactly causes "too many arguments" in your script, but the argument to -i is a double-quoted string with just a command substitution inside, so it will be passed as a single argument to ffmpeg (with the newlines from the echo embedded). It shouldn't result in too many arguments.
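If the list fits comfortably in memory, the file can also be read just once into a Bash array and the pairs generated from index arithmetic. This is a sketch of the same all-pairs logic, not from the original answer; it assumes Bash 4+ for `mapfile`:

```shell
# Read all lines of vids.list into an array, then visit each unordered pair once.
mapfile -t vids < vids.list
for ((i = 0; i < ${#vids[@]} - 1; i++)); do
    for ((j = i + 1; j < ${#vids[@]}; j++)); do
        # replace this printf with the actual ffmpeg signature command
        printf "compare '%s' with '%s'\n" "${vids[i]}" "${vids[j]}"
    done
done
```

This avoids re-opening the list file N times and leaves the original list untouched.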
Using nested while loops for ffmpeg processing
1,479,208,613,000
I call ffmpeg like this from Mac's terminal: $ find . -type f -name *.webm | while IFS= read -r f; do echo "$f"; ffmpeg -i "$f" "${f%.webm}".mp4 2> ~/Desktop/err; done Only the first file returned by find gets processed: ./artist/Moody Blues/_vid/Nights in white satin_lyrics.webm Excerpt from err: Enter command: |all |-1 [ ] Parse error, at least 3 arguments were expected, only 1 given in string 's [360p].webm' Enter command: |all |-1 [ ] Parse error, at least 3 arguments were expected, only 1 given in string 'hannel/Ash Wainman/_Inbox/BOHEMIAN RHAPSODY - ASH WAINMAN[HD,1280x720].webm' We should have 'channel' rather than 'hannel'.
You're improperly using find and needlessly creating a shell loop (it hurts to read!), because you can (should) run ffmpeg directly from inside find:

find . -type f -name '*.webm' \
    -exec sh -c 'echo "$1"; ffmpeg -nostdin -i "$1" "${1%.webm}".mp4 2>> ~/Desktop/err' sh {} ';'

(Note the quoted '*.webm' — left unquoted, the shell expands the glob in the current directory before find ever sees it.)

With deference to LordNeckBeard (though mine is assuredly hairier).
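When there are many files, spawning one `sh` per file gets expensive; `-exec ... +` hands find's results to as few shell invocations as possible, and a `for` loop inside the shell walks the batch. A sketch of that variant (same ffmpeg flags as above, assumed rather than re-verified here):

```shell
# One sh handles a whole batch of filenames; "$@" holds the batch,
# and `for f do` iterates over the positional parameters.
find . -type f -name '*.webm' -exec sh -c '
    for f do
        echo "$f"
        ffmpeg -nostdin -i "$f" "${f%.webm}.mp4"
    done' sh {} +
```

The `${f%.webm}.mp4` expansion strips the old suffix and appends the new one, which works even for names containing spaces.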
Invoking ffmpeg iteratively
1,479,208,613,000
I wrote a small script that converts HQ video to LQ:

ls video/hq | cut -d. -f1 | while read line ; do
    HQ=./video/hq/$line.mp4
    LQ=./video/lq/$line.mp4
    ffmpeg -i $HQ -crf 40 $LQ
done;

When I run ls video/hq | cut -d. -f1 I get back:

1502460615677
1502461135975
1502461292963
1502461373947
1502461493936
1502461782119

But when running the conversion script only 1502460615677 is processed. If I replace the line with ffmpeg with echo $line, echo runs for all, but with ffmpeg it only runs for one. Anyone know why ffmpeg changes how this runs?
Apparently ffmpeg reads from standard input, which interferes with the read command. So I'm redirecting its input from /dev/null:

ls video/hq | cut -d. -f1 | while read line ; do
    HQ=./video/hq/$line.mp4
    LQ=./video/lq/$line.mp4
    ffmpeg -i $HQ -crf 40 $LQ < /dev/null
done;
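ffmpeg also has a `-nostdin` flag that disables its interactive reading of standard input, which achieves the same thing without the redirection:

```shell
# -nostdin stops ffmpeg from consuming the loop's stdin,
# so `read` still sees the remaining lines from the pipe.
ls video/hq | cut -d. -f1 | while read line ; do
    HQ=./video/hq/$line.mp4
    LQ=./video/lq/$line.mp4
    ffmpeg -nostdin -i $HQ -crf 40 $LQ
done
```

The underlying problem is generic: any command inside the loop that reads stdin will swallow the lines the loop was supposed to iterate over.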
ffmpeg script only running for first in list?
1,479,208,613,000
To merge two MP4 files, it's necessary to go through .ts files:

ffmpeg -i input1.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts input1.ts
ffmpeg -i input2.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts input2.ts
ffmpeg -i "concat:input1.ts|input2.ts" -c copy output.mp4

But I get this error on the first/second command:

Codec 'mpeg4' (13) is not supported by the bitstream filter 'h264_mp4toannexb'. Supported codecs are: h264 (28)
Error initializing bitstream filter: h264_mp4toannexb

Have you got an idea?
You're trying to do it using the concat protocol which concatenates at the file level. Do you get better results if you try to concatenate via the demuxer? You would list your input files in a text file (mylist.txt) and then: ffmpeg -f concat -safe 0 -i mylist.txt -c copy output.mp4 The -safe 0 is not required if the paths are relative. This method and the one you tried are both suggested here on ffmpeg.org as well as various other methods of concatenation. Surely there is one there that will work for you? If all else fails you could transcode them to h264 first, which apparently they are not already in.
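For completeness, the list file the concat demuxer expects contains one `file '...'` directive per line, which is easy to generate with printf (input file names here are taken from the question):

```shell
# Build the list file for the concat demuxer, then concatenate without re-encoding.
printf "file '%s'\n" input1.mp4 input2.mp4 > mylist.txt
ffmpeg -f concat -safe 0 -i mylist.txt -c copy output.mp4
```

With more inputs you can let the shell expand a glob, e.g. `printf "file '%s'\n" part*.mp4 > mylist.txt`.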
FFMPEG - Merge two MP4 files
1,479,208,613,000
The simplest slideshow is just a sequence of pictures with each being displayed for the same amount of time. I would also like to make a sequence, but more complicated with respect to the duration: I want the first picture to have a duration of 1 second, the next a little bit less than a second, the next even a little shorter, so that the sequence starts slow and gets faster towards the end. I do not know how to do it; the only thing I can guess is that the duration for each picture could be calculated by a function like

duration = 1/picture_number
duration = 1 - (picture_number/picture_total)

or in a script:

duration=1
while [duration > 0]
do
    duration=duration*0.99
done
This can be done in a single command. The basic method is to start with a slideshow where each image has the same duration and then use the select filter to trim out greater amounts from the display time of each successive image.

ffmpeg -framerate 1 -i img%d.jpg \
       -vf fps=25,select='lt(mod(t,1),1-floor(t/1)/25)',setpts=N/25/TB \
       out.mp4

-framerate 1 --> this controls how fast the images originally turn over.
fps=25 --> we set the final framerate. Also, need to do this to have frames to work with.
select='lt(mod(t,1),1-floor(t/1)/25)' --> From each second, keep one frame less.
setpts=N/25/TB --> make the timestamps of the selected frames continuous, else ffmpeg will duplicate frames to fill gaps and defeat our goal.

To start with an initial duration of 3 seconds, you would change framerate to 1/3 and change select to lt(mod(t,3),3-floor(t/3)/25). You would need to change the 25 to control the speed at which the duration is reduced. Higher values will reduce duration at a slower speed.
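An alternative approach, if you prefer explicit per-image durations over the select-filter trick, is the concat demuxer, whose list format accepts a `duration` directive per entry. A sketch that shrinks each duration by 10% per image — the 0.9 factor and the `img*.jpg` / `list.txt` names are arbitrary choices, not from the answer above:

```shell
# Emit a concat-demuxer list where each image is shown a bit shorter than
# the previous one, then render it with variable frame rate.
d=1.0
for img in img*.jpg; do
    printf "file '%s'\nduration %s\n" "$img" "$d"
    d=$(awk -v d="$d" 'BEGIN { printf "%.3f", d * 0.9 }')
done > list.txt
ffmpeg -f concat -safe 0 -i list.txt -vsync vfr out.mp4
```

Note that, per a documented quirk of the concat demuxer, the duration of the final entry is only honoured if the last file is listed a second time at the end of the list.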
Is it possible to create with FFMPEG a series of pictures with continuously shorter duration?
1,479,208,613,000
I have an audiobook which consists of over 700 ra files (RealAudio) which I'm trying to batch convert to mp3 using ffmpeg. RA files are named as chapter-verse.ra (e.g. 13-01.ra). I run a script and it gets as far as processing 32 files and then stops. For each file it displays an error, but converts it nonetheless. Here is my script:

#!/bin/bash
#
outdir=/data/sounds/output
srcdir=/data/sounds
# Cleanup first
rm -f /data/sounds/output/*

ls -1 ${srcdir}/*.ra | while read file
do
    infile=$(basename $file)
    chapter=$(echo $infile | cut -f1 -d"-")
    verse=$(echo $infile | cut -f2 -d"-")
    verse=$(echo $verse | cut -f1 -d".")
    echo "File $file | Target: Chapter $chapter Verse $verse"
    echo
    ffmpeg -i $file -loglevel error -acodec libmp3lame ${outdir}/Chapter${chapter}_Verse${verse}.mp3
done

Here is an extract of the output I'm getting:

[ac3 @ 0x221e520] frame sync error
Error while decoding stream #0:0: Invalid data found when processing input
/data/sounds/13-02.ra: Input/output error
File data/sounds/13-03.ra | Target: Chapter 13 Verse 03
data/sounds/13-03.ra: No such file or directory
File /data/sounds/13-04.ra | Target: Chapter 13 Verse 04
[ac3 @ 0x1a4d520] frame sync error
Error while decoding stream #0:0: Invalid data found when processing input
/data/sounds/13-04.ra: Input/output error
File data/sounds/13-05.ra | Target: Chapter 13 Verse 05
data/sounds/13-05.ra: No such file or directory
File /data/sounds/13-06.ra | Target: Chapter 13 Verse 06

What's puzzling is that it complains about "no such file or directory", but if I simply echo the ffmpeg command (without executing it) the script runs fine all the way till the end. When the script aborts, I find my output directory has some files in it which all work fine; I just wish it would process all of them?!

My environment: Fedora workstation 21, ffmpeg version 2.4.8 Copyright (c) 2000-2015 the FFmpeg developers
Figured this one out. This has already been answered here The ffmpeg line now reads: < /dev/null ffmpeg -i $file -loglevel error -acodec libmp3lame ${outdir}/Chapter${chapter}_Verse${verse}.mp3 And goes all the way till the end.
FFMPEG batch convert quits before processing all files
1,479,208,613,000
I'm looking for a way to convert all alac files in a directory recursively to another format. Problem: alac as well as aac use suffix m4a, so find -name "*.m4a" doesn't work.
You need a tool that can detect the codec in the m4a files. One such tool is avprobe which is available in debian based linuxes in package libav-tools (I use Ubuntu 14.04). Then you can do like this (if there are no newlines in file names..): find . -name \*.m4a | while read file; do avprobe "$file" 2>&1 |grep -q 'Audio: alac' && echo "$file"; done obviously substituting echo "$file" for the convert command. If it's not feasible to install such a tool, you can probably estimate it with grep -q Halac "$file" && grep -q '$alac' "$file". The few apple lossless files I tested contain those two strings, but it can match files which are not really apple lossless codec with a small probability, and I'm not quite sure apple lossless files are guaranteed to contain those strings. If that's okay, the above becomes find . -name \*.m4a | while read file; do grep -q Halac "$file" && grep -q '$alac' "$file" && echo "$file"; done
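If ffmpeg is installed, its companion ffprobe can report the codec name directly, which avoids grepping for magic strings. A sketch of the same loop using ffprobe instead of avprobe (the ffprobe flags are the standard ones for printing a bare codec name; the null-delimited read also makes it safe for filenames with newlines):

```shell
# Print every .m4a whose first audio stream is Apple Lossless (alac).
find . -name '*.m4a' -print0 | while IFS= read -r -d '' f; do
    codec=$(ffprobe -v error -select_streams a:0 \
        -show_entries stream=codec_name \
        -of default=noprint_wrappers=1:nokey=1 "$f")
    [ "$codec" = alac ] && printf '%s\n' "$f"
done
```

Substitute the `printf` with your conversion command, e.g. `ffmpeg -nostdin -i "$f" "${f%.m4a}.opus"`.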
Find and convert all ALAC files to mp3 / opus / aac
1,371,492,830,000
I have a bash script which starts ffplay in background, looping a short sound sample indefinitely. Then it does some other work, and finally it kills ffplay. foo() { ffplay -loop 0 sound.wav &>/dev/null & trap "kill $!" RETURN (do work...) } I want ffplay to finish playing the currently played sample, and only then die. Is this possible?
You can get continuous statistics or debug output from ffplay which consists of lines ending carriage-return (^M) like this: nan M-A: nan fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0 ^M -0.07 M-A: -0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0 ^M -0.01 M-A: -0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0 ^M 0.02 M-A: 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0 ^M It seems the first column is "nan", not-a-number, or negative at the start of a loop. You could check for this and kill the process when it is seen. The following bash script demonstrates this. #!/bin/bash sound=somefile.wav touch flagfile (sleep 4; rm -f flagfile) & ffplay -nodisp -stats -loop 0 "$sound" 2>&1 | while read -d $'\r' time rest do case $time in nan|-*|0.00) [ ! -f flagfile ] && exit ;; esac done It creates a file flagfile, which you need to remove to say you want to stop the loop. In this demo, it is just done after 4 seconds. ffplay is run with option -stats. The statistics are written on stderr so we redirect that to stdout with 2>&1. The statistics are piped to a while loop that uses read with delimiter carriage-return ($'\r') putting the first word in variable time and the rest in rest. The case compares the first word with the pattern nan or anything beginning with -, or the number 0, and if the flagfile no longer exists, the script exits. After a few more writes of statistics, ffplay will be killed with signal SIGPIPE. You will have to decide if this short delay is good enough for you. It depends how noisy the first few milliseconds of the sound file are. If your ffplay does not respond to SIGPIPE, you can run your own kill command by noting the process id of the command in a temporary file pidfile. Note the use of $BASHPID rather than $$, as the latter does not change within a sub-shell. #!/bin/bash sound=somefile.wav touch flagfile (sleep 4; rm -f flagfile) & ( echo $BASHPID >pidfile exec ffplay -nodisp -stats -loop 0 $sound 2>&1 ) | while read -d $'\r' time rest do case $time in nan|-*|0.00) [ ! 
-f flagfile ] && break ;; esac done kill -hup $(<pidfile) rm -f pidfile
FFPLAY - loop sound until signal received
1,371,492,830,000
I have a batch of sound samples which are too short (2.15 sec), and I want to extend the sustain to a total of about 10 seconds, meaning stretch the last 0.50 second of the file to 10 seconds. I can do this on each single file in audacity with paulstretch but was wondering if there's a way to do so in batch from the command line. Here is the original sample: original and here is the result I'd want: stretched
It's not possible to have sox stretch a sample by more than a factor of 10x directly. We could take an 0.5s sample and stretch it to 5s and then double that, but it gets complicated. Instead, I've chosen to take the last full second and stretch that. play original.wav trim 0 -1.0 : tempo -m 0.1 You can batch process exactly as you'd expect, by using a loop: for w in *.wav do sox "$w" "stretch_$w" trim 0 -1.0 : tempo -m 0.1 done
How to extend sustain in batch wav files?
1,371,492,830,000
I would like to know if there is any difference in the final output between the various command-line tools for encoding FLAC files, like ffmpeg, sox, the “official” flac etc. In some contexts, I have noticed that it's recommended to use flac over the others, but given that FLAC represents lossless encoding, am I correct in assuming that they should all produce identical output (given the same options)?
The FLAC encoder has a ton of parameters, so you'll need to consult the source code of ffmpeg/sox to see how they use the codec. But despite all of this, does it really matter? FLAC is a lossless encoder, so even if flac, ffmpeg and sox produce different FLAC files, they will all decode bit-perfectly. FFmpeg will produce a different output (header) as it adds itself to the tags unless instructed otherwise.
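You can verify the bit-perfect claim yourself by decoding both encodes to raw PCM and comparing. A sketch, assuming two encodes of the same 16-bit source; the file names are made up for illustration:

```shell
# Decode both FLAC files to raw signed 16-bit PCM and compare byte for byte.
# (For 24-bit sources, use -f s24le instead so no conversion happens.)
ffmpeg -v error -i from-flac.flac   -f s16le from-flac.pcm
ffmpeg -v error -i from-ffmpeg.flac -f s16le from-ffmpeg.pcm
cmp from-flac.pcm from-ffmpeg.pcm && echo 'decodes are bit-identical'
```

Alternatively, `metaflac --show-md5sum file.flac` prints the MD5 of the unencoded audio that the encoder stored in the STREAMINFO block; two lossless encodes of the same source should show the same checksum.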
FLAC encoders – any difference in output between the command-line tools?
1,371,492,830,000
I wish to convert the encoder of some audio files. The problem is that my car can't play audio files encoded with LAME3.99.5; it's an issue with some Volvo cars. The problem is with both USB and CD. The encoder needs to be LAME3.95 or earlier, or another encoder. What command should I use to achieve this? I would like to make it scriptable to encode a lot of files recursively. I'm trying SoX and ffmpeg with no luck.
Are you aware of the standalone "lame"? You can choose which version to download: https://sourceforge.net/projects/lame/files/lame/3.95/ ... but then you need to compile it ;-p

$ uname -a
... 20.04.1-Ubuntu ... x86_64 GNU/Linux
$ tar -xvf lame-3.95.tar.gz
$ cd lame-3.95/
$ ./configure 2>&1 > log.txt
$ make all 2>&1 >> log.txt
$ grep -E 'fail|erro' log.txt | wc -l
1
$ grep -E 'fail|erro' log.txt
checking for library containing strerror... none required
$ ./frontend/lame --help
LAME version 3.95 (http://www.mp3dev.org/)
usage: ./frontend/lame [options] <infile> [outfile]

    <infile> and/or <outfile> can be "-", which means stdin/stdout.

RECOMMENDED:
    lame -h input.wav output.mp3

OPTIONS:
    -b bitrate      set the bitrate, default 128 kbps
    -f              fast mode (lower quality)
    -h              higher quality, but a little slower. Recommended.
    -m mode         (s)tereo, (j)oint, (m)ono
                    default is (j) or (s) depending on bitrate
    -V n            quality setting for VBR. default n=4
    --preset type   type must be "medium", "standard", "extreme", "insane",
                    or a value for an average desired bitrate and depending
                    on the value specified, appropriate quality settings
                    will be used. "--preset help" gives more info on these
    --longhelp      full list of options
...

Now you need to have your "mp3" file as WAV before you run this; I'd guess sox or ffmpeg can create that, one file at a time...

I would like to make it scriptable to encode a lot of files recursively.

Now, go to www.tldp.org and read the Bash guides; there is one for beginners, then another one named "Advanced".
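Once the old lame is built, batch re-encoding can be scripted by decoding each file to WAV on stdout with ffmpeg and piping it straight into lame — no temporary WAV files needed. A sketch (the output naming scheme is my own invention, and `./frontend/lame` is the freshly built binary from the transcript above):

```shell
# Re-encode every mp3 under the current directory with the old LAME build.
# ffmpeg decodes to WAV on stdout; lame reads "-" (stdin) and writes the new file.
find . -name '*.mp3' -print0 | while IFS= read -r -d '' f; do
    ffmpeg -nostdin -v error -i "$f" -f wav - |
        ./frontend/lame -h - "${f%.mp3}.lame395.mp3"
done
```

`-nostdin` keeps ffmpeg from swallowing the filename stream the loop is reading, and the null-delimited read handles names with spaces.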
Convert encode of audio files
1,371,492,830,000
This is the ffmpeg version I am working with on Debian testing - $ ffmpeg -version ffmpeg version 4.4.1-2+b1 Copyright (c) 2000-2021 the FFmpeg developers built with gcc 11 (Debian 11.2.0-12) configuration: --prefix=/usr --extra-version=2+b1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared libavutil 56. 70.100 / 56. 70.100 libavcodec 58.134.100 / 58.134.100 libavformat 58. 76.100 / 58. 76.100 libavdevice 58. 13.100 / 58. 13.100 libavfilter 7.110.100 / 7.110.100 libswscale 5. 9.100 / 5. 9.100 libswresample 3. 9.100 / 3. 9.100 libpostproc 55. 9.100 / 55. 9.100 This is the command I was trying to work with - $ ffmpeg -ss 00:22:20 -t 60 -i 123.mkv 456.mkv This comes from extracting/copying audio from a specific part of a video file, possible? which I had asked years ago. I also had a look at FFMpeg : Creating a video clip of approx. 
10 seconds when video duration is unknown without audio- but if I try to use the timestamp as showed in that, it errors out saying - Invalid duration specification for ss: 00:22:20:0.0 The media file I am playing is around 50 odd minutes and I just want that one minute file 22:20 second and a minute after. Has something changed in ffmpeg or the way it processes things. I have obviously changed the file names. Nevertheless, this is what is it shows the file as if I put it under mkvinfo $ mkvinfo 456.mkv + EBML head |+ EBML version: 1 |+ EBML read version: 1 |+ Maximum EBML ID length: 4 |+ Maximum EBML size length: 8 |+ Document type: matroska |+ Document type version: 4 |+ Document type read version: 2 + Segment: size 6959454 |+ Seek head (subentries will be skipped) |+ EBML void: size 81 |+ Segment information | + Timestamp scale: 1000000 | + Title: 123 | + Multiplexing application: Lavf58.76.100 | + Writing application: Lavf58.76.100 | + Segment UID: 0x19 0xba 0x01 0xe0 0xed 0x6f 0x79 0xef 0xfb 0x9d 0xe6 0xcd 0x2b 0xad 0x2f 0x79 | + Duration: 00:23:54.905000000 |+ Tracks | + Track | + Track number: 1 (track ID for mkvmerge & mkvextract: 0) | + Track UID: 4989308985802999081 | + "Lacing" flag: 0 | + Name: abcd | + Language: und | + Codec ID: V_MPEG4/ISO/AVC | + Track type: video | + Default duration: 00:00:00.041708333 (23.976 frames/fields per second for a video track) | + Video track | + Pixel width: 1280 | + Pixel height: 720 | + Interlaced: 2 | + Video colour information | + Horizontal chroma siting: 1 | + Vertical chroma siting: 2 | + Codec's private data: size 45 (H.264 profile: High @L3.1) | + Track | + Track number: 2 (track ID for mkvmerge & mkvextract: 1) | + Track UID: 5858605359486045911 | + "Lacing" flag: 0 | + Name: abcd | + Language: eng | + Codec ID: A_VORBIS | + Track type: audio | + Audio track | + Channels: 2 | + Sampling frequency: 48000 | + Bit depth: 32 | + Codec's private data: size 3959 | + Track | + Track number: 3 (track ID for mkvmerge & 
mkvextract: 2) | + Track UID: 6757137498994684877 | + "Lacing" flag: 0 | + Name: abcd | + Language: eng | + Codec ID: S_TEXT/ASS | + Track type: subtitles | + Codec's private data: size 576 |+ Tags | + Tag | + Targets | + Simple | + Name: COMMENT | + String: abcd | + Simple | + Name: ENCODER | + String: Lavf58.76.100 | + Tag | + Targets | + Track UID: 4989308985802999081 | + Simple | + Name: BPS | + String: 1050683 | + Simple | + Name: BPS | + Tag language: eng | + String: 1050683 | + Simple | + Name: DURATION | + Tag language: eng | + String: 00:47:01.110000000 | + Simple | + Name: NUMBER_OF_FRAMES | + String: 67639 | + Simple | + Name: NUMBER_OF_FRAMES | + Tag language: eng | + String: 67639 | + Simple | + Name: NUMBER_OF_BYTES | + String: 370511729 | + Simple | + Name: NUMBER_OF_BYTES | + Tag language: eng | + String: 370511729 | + Simple | + Name: _STATISTICS_WRITING_APP | + String: mkvmerge v13.0.0 ('The Juggler') 64bit | + Simple | + Name: _STATISTICS_WRITING_APP | + Tag language: eng | + String: mkvmerge v13.0.0 ('The Juggler') 64bit | + Simple | + Name: _STATISTICS_WRITING_DATE_UTC | + String: 2018-09-20 14:51:30 | + Simple | + Name: _STATISTICS_WRITING_DATE_UTC | + Tag language: eng | + String: 2018-09-20 14:51:30 | + Simple | + Name: _STATISTICS_TAGS | + String: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES | + Simple | + Name: _STATISTICS_TAGS | + Tag language: eng | + String: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES | + Simple | + Name: ENCODER | + String: Lavc58.134.100 libx264 | + Simple | + Name: DURATION | + String: 00:01:00.022000000 | + Tag | + Targets | + Track UID: 5858605359486045911 | + Simple | + Name: BPS | + String: 640000 | + Simple | + Name: BPS | + Tag language: eng | + String: 640000 | + Simple | + Name: DURATION | + Tag language: eng | + String: 00:47:01.120000000 | + Simple | + Name: NUMBER_OF_FRAMES | + String: 88160 | + Simple | + Name: NUMBER_OF_FRAMES | + Tag language: eng | + String: 88160 | + Simple | + Name: 
NUMBER_OF_BYTES | + String: 225689600 | + Simple | + Name: NUMBER_OF_BYTES | + Tag language: eng | + String: 225689600 | + Simple | + Name: _STATISTICS_WRITING_APP | + String: mkvmerge v13.0.0 ('The Juggler') 64bit | + Simple | + Name: _STATISTICS_WRITING_APP | + Tag language: eng | + String: mkvmerge v13.0.0 ('The Juggler') 64bit | + Simple | + Name: _STATISTICS_WRITING_DATE_UTC | + String: 2018-09-20 14:51:30 | + Simple | + Name: _STATISTICS_WRITING_DATE_UTC | + Tag language: eng | + String: 2018-09-20 14:51:30 | + Simple | + Name: _STATISTICS_TAGS | + String: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES | + Simple | + Name: _STATISTICS_TAGS | + Tag language: eng | + String: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES | + Simple | + Name: ENCODER | + String: Lavc58.134.100 libvorbis | + Simple | + Name: DURATION | + String: 00:01:00.003000000 | + Tag | + Targets | + Track UID: 6757137498994684877 | + Simple | + Name: BPS | + String: 40 | + Simple | + Name: BPS | + Tag language: eng | + String: 40 | + Simple | + Name: DURATION | + Tag language: eng | + String: 00:46:08.244000000 | + Simple | + Name: NUMBER_OF_FRAMES | + String: 636 | + Simple | + Name: NUMBER_OF_FRAMES | + Tag language: eng | + String: 636 | + Simple | + Name: NUMBER_OF_BYTES | + String: 14113 | + Simple | + Name: NUMBER_OF_BYTES | + Tag language: eng | + String: 14113 | + Simple | + Name: _STATISTICS_WRITING_APP | + String: mkvmerge v13.0.0 ('The Juggler') 64bit | + Simple | + Name: _STATISTICS_WRITING_APP | + Tag language: eng | + String: mkvmerge v13.0.0 ('The Juggler') 64bit | + Simple | + Name: _STATISTICS_WRITING_DATE_UTC | + String: 2018-09-20 14:51:30 | + Simple | + Name: _STATISTICS_WRITING_DATE_UTC | + Tag language: eng | + String: 2018-09-20 14:51:30 | + Simple | + Name: _STATISTICS_TAGS | + String: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES | + Simple | + Name: _STATISTICS_TAGS | + Tag language: eng | + String: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES | + Simple | + Name: 
ENCODER | + String: Lavc58.134.100 ssa | + Simple | + Name: DURATION | + String: 00:23:54.905000000 |+ Cluster

Now what I can infer from the above is that ffmpeg merely cut the video and hence shows properties from the old version, including the duration, although the media file I created was around a minute. Is there any way to do better, meaning it uses the ffmpeg version from today and gives the same result, and the duration would change to one minute rather than the 23 minutes it is showing? FWIW, mkvmerge from my current system is -

$ mkvmerge --version
mkvmerge v64.0.0 ('Willows') 64-bit
The sample you posted looks fine, yet the error message suggests there was an extra 0.0 appended to the start time when you executed the command. This should work: ffmpeg -ss 00:22:20 -t 60 -i 123.mkv 456.mkv
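The stale 23-/47-minute figures in mkvinfo come from the per-track DURATION statistics tags that `-c copy` carried over from the source. One option, untested against your particular file and therefore only a sketch, is to drop the inherited metadata while copying:

```shell
# Cut the clip as before, but do not carry over the source's global metadata tags.
ffmpeg -ss 00:22:20 -t 60 -i 123.mkv -map_metadata -1 -c copy 456.mkv
```

Alternatively, mkvpropedit from the mkvtoolnix package can recompute fresh statistics on the cut file: `mkvpropedit 456.mkv --add-track-statistics-tags`.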
Extracting video via ffmpeg gives incorrect duration
1,371,492,830,000
I'm trying to create a script, which should read a video folder and create a list of video files to be processed by ffprobe to identify the codec. Videos NOT processed with a specific codec (in this case HEVC) should be put in a new list for further processing by ffmpeg. I created a very rudimentary script, but hit a brick wall at a point where the variable ffprobe_input needs to be changed in order to be passed as the next input for ffprobe. Also, even if this part of the script was working, I'm puzzled as to how to create the filtered list of files after the ffprobe processing, since the only output is a single word, ex: hevc or x264. The actual script is below, alongside my notes, which should be more descriptive; also in the notes are some of the ways I tried to make things work. This is the intended use of the script: ./script.sh -p /path\ to\ videos

#!/bin/bash

#Read path (-p) input and exit on error.
while getopts p: flag
do
    case "${flag}" in
        p) vpath=${OPTARG};;
        *) echo "usage: $0 [-p]" >&2
           exit 1 ;;
    esac
done

#Now we echo the path for neatness
echo -e "Selected root video path: $vpath";

#Check if the path is valid. The path must be escaped. Cd into the folder and execute: printf "%q\n" "$(pwd)"
[ -d "$vpath" ] && echo "Directory $vpath exists." || echo "Error: Directory $vpath does not exist. Tip: make sure the spaces are escaped in folder names, ex: ===video\ folder===."

#Prepare a list of video files with full escaped paths, ready for ffprobe/ffmpeg input.
find "$vpath" -type f \( -iname "*.mkv" -o -iname "*.mp4" -o -iname "*.avi" \) | sed 's/ /\\ /g' >> full_list.txt

#read the total number of lines from full_list.txt
nrl_total="$(wc -l full_list.txt | grep -Eo "[0-9]{0,7}")"
echo -e "There are a total of $nrl_total videos for further processing."

#read line number and pass to $ffprobe_input
# nrl=($(seq 1 "$nrl_total"))
# nrl={1..$nrl_total..1}
# for $nlr in {1..$nrl_total..1}; do
# nrl=({1..$nrl_total..1})
filename='full_list.txt'
nrl=1
while read line; do
    echo "$nrl"
    nrl=$((n+1))
    #done < $filename

    #Use line number in the "p" command, ex: 1p.
    # ffprobe_input="$(sed -n 1p full_list.txt)"
    ffprobe_input="$(sed -n "$nrl"p full_list.txt)"

    #Now pass the input to ffprobe to determine if the videos are HEVC or not. Output is a single word, ex: hevc or x264.
    eval ffprobe -v error -select_streams v:0 -show_entries stream=codec_name -of default=noprint_wrappers=1:nokey=1 -i "$ffprobe_input"
done < $filename
rm full_list.txt
Assuming your filenames don't contain newlines, you don't need to mangle them in any way. The output from file has one line per filename, so just store it and loop over the resulting file: > non-hevc.txt # clear the output list find "$vpath" -type f \( -iname "*.mkv" -o -iname "*.mp4" -o -iname "*.avi" \) \ > full_list.txt while IFS= read -r file; do result=$(ffprobe -v error -select_streams v:0 -show_entries \ stream=codec_name -of default=noprint_wrappers=1:nokey=1 -i "$file") if [ "$result" != hevc ]; then echo "$file" >> non-hevc.txt fi done < full_list.txt rm -f full_list.txt Here, the output of ffprobe is captured with the command substitution $(...) and stored to result, which we then look at. I don't see any reason for the dance with sed -n "$nrl"p inside the loop reading the filename list, since read already reads the same line. We do need IFS= and -r to not mangle the input, though. There's also no reason to escape any whitespace with backslashes, the quoted expansion of "$file" passes the contents of the variable as-is to the command. Undoing the escaping would also be difficult, when you use eval, it processes a lot of other stuff too, and would barf on e.g. parenthesis. Not sure if you want to append the output of find to whatever full_list.txt already contained, or recreate the list. Since we process the list immediately, it seem to me to make more sense to ignore any old contents. Note that like terdon comments, you don't strictly need the intermediate file to store the list of filenames. You could do just find ... | while IFS= read file, do ..., or with process substitution in Bash/ksh/zsh while IFS= read file, do ... done < <(find ...). The difference between the two matters if you want to set variables inside the while loop, see: Why is my variable local in one 'while read' loop, but not in another seemingly similar loop?
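A quick self-contained sketch of the same pattern (the temp directory and file names are made up) — it also demonstrates why the loop is fed from a redirect or process substitution rather than a pipe: a variable set inside the loop survives into the rest of the script.

```shell
# Toy version of the filtering loop: classify files by extension and
# keep a counter. The counter survives because the while loop runs in
# the current bash shell, fed via process substitution, not a pipe.
tmp=$(mktemp -d)
touch "$tmp/a.mkv" "$tmp/b.mp4" "$tmp/notes.txt"
non_video=0
while IFS= read -r f; do
    case $f in
        *.mkv|*.mp4|*.avi) ;;              # here you would run ffprobe on "$f"
        *) non_video=$((non_video + 1)) ;;
    esac
done < <(find "$tmp" -type f)
echo "$non_video"   # 1 — the counter set inside the loop is still visible
```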
Problem with utilizing a "while loop" and subsequently processing data in a bash script
1,371,492,830,000
I have a video file which is 20 seconds long. I cut this video file into segments like video_file_0 -> starts at 0:00, ends at 0:02 video_file_1 -> starts at 0:02, ends at 0:04 video_file_2 -> starts at 0:04, ends at 0:06 video_file_3 -> starts at 0:06, ends at 0:08 video_file_4 -> starts at 0:08, ends at 0:10 video_file_5 -> starts at 0:10, ends at 0:12 video_file_6 -> starts at 0:12, ends at 0:14 video_file_7 -> starts at 0:14, ends at 0:16 video_file_8 -> starts at 0:16, ends at 0:18 video_file_9 -> starts at 0:18, ends at 0:20 So my question is, how can I play these video files continuously in a single window, exactly like playing the whole video file from 0:00 to 0:20, without closing and reopening windows on every switch between video files. Can I use ffplay, ffmpeg or vlcj for this functionality? I tried find -type f -name "video_file_*" | while read f; do ffplay -autoexit -- "$f"; done But this code closes and reopens the window between every video file, and I don't want that. How can I do that? EDIT: I am building a Java project in which the streams are shown inside a JFrame. So I want this functionality shown inside the JFrame.
What about mpv --gapless-audio=yes --loop-playlist=inf video_file_* to enable Gapless playback as documented in the manual: --gapless-audio=<no|yes|weak> Try to play consecutive audio files with no silence or disruption at the point of file change. Default: weak. no: Disable gapless audio. yes: The audio device is opened using parameters chosen for the first file played and is then kept open for gapless playback. This means that if the first file for example has a low sample rate, then the following files may get resampled to the same low sample rate, resulting in reduced sound quality. If you play files with different parameters, consider using options such as --audio-samplerate and --audio-format to explicitly select what the shared output format will be. weak: Normally, the audio device is kept open (using the format it was first initialized with). If the audio format the decoder output changes, the audio device is closed and reopened. [...] along with: --loop-playlist=<N|inf|force|no>, --loop-playlist Loops playback N times. A value of 1 plays it one time (default), 2 two times, etc. inf means forever. no is the same as 1 and disables looping. If several files are specified on command line, the entire playlist is looped. --loop-playlist is the same as --loop-playlist=inf. mpv also takes care of not creating any flicker in the video window. For a longer but reasonable playlist video_file_* can be replaced (with an adequate variant of sort) with $(printf '%s\n' video_file_* | sort -V), but that wouldn't handle spaces & other chars. So using find + xargs allows handling any special character (which might still choke mpv itself) but would disrupt tty control for interactive control, so here it's artificially restored with </dev/tty: find -type f -name 'video_file_*' -print0 | sort -V -z | xargs -0 -- sh -c 'exec mpv </dev/tty --gapless-audio=yes --loop-playlist=inf -- "$@"' sh (the trailing sh fills in $0 so the first file isn't silently dropped from "$@"). Note: recent versions of mpv already redirect /dev/tty back themselves. 
UPDATE: mpv has a --wid= option to embed itself into an other window. So if spawned from an application having prepared a window for this it could just be given by the application the parameter for the target window. From the manual: --wid=<ID> This tells mpv to attach to an existing window. If a VO is selected that supports this option, it will use that window for video output. mpv will scale the video to the size of this window, and will add black bars to compensate if the aspect ratio of the video is different. On X11, the ID is interpreted as a Window on X11. [...] So without requiring much code, on X11, using xwininfo to retrieve a window's Window id, eg: 0xdeadbeef, one can use mpv --wid=0xdeadbeef ... to embed it in that window. For example it works fine, including controls, on a simple xterm. Any more advanced use should probably involve embedding libmpv into other programs. Here are also pointers to X11's XReparentWindow (but mpv should be kept in charge of doing this): Embedding an X11 window, belonging to an independent process launched by me, into my own window? XLib: Reparenting a Java window with popups properly translated (but AFAIK, OP's goal is the other way around) xdotool has a windowreparent command.
How to play a playlist continuously?
1,371,492,830,000
I used snd-aloop module to create a loopback audio stream. Now I want to somehow mux my desktop audio from pulseaudio and my microphone audio streams into this loopback stream, and while that can be done via ffmpeg, I can't find a way to write the output to the ALSA device.
To achieve this, all I had to do was to specify the format of the output "file" as alsa and set output to hw:[snd-aloop-card],1,0 Example: ffmpeg -i myfile.ogg -f alsa hw:2,1,0
Write audio stream to an ALSA device with ffmpeg
1,371,492,830,000
I am using ffmpeg to take screenshots of twitch streamers using this code: counter = 0 while counter <=5: os.system('ffmpeg -i ' + stream + f' -r 0.5 -f image2 {dir_path}/output_%09d.jpg') counter += 1 It's my first time working with ffmpeg so I thought all I needed was a while loop with a counter to control this but it literally takes thousands of screenshots. Is there a way to stop ffmpeg without stopping the entire script?
From man ffmpeg -vframes number (output) Set the number of video frames to output. This is an obsolete alias for "-frames:v", which you should use instead. An example based on your existing command: ffmpeg -i <stream> -r 0.5 -frames:v 5 -f image2 output_%05d.jpg -r 0.5 sets the rate, frames per seconds, you will be getting 1 frame every 2 seconds, from the beginning. Alternative you can also set -ss to start from a different time. -frames:v 5 you will exit after capturing 5 screenshots. You could also use 1 to take one screenshot and exit, if you execute this command into a loop, like in your example. But I think you would prefer to call one ffmpeg process for this (without loop).
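To sanity-check the numbers: with -r 0.5 you get one frame every two seconds, so -frames:v 5 covers roughly ten seconds of the stream. A quick check of that arithmetic:

```shell
# frames / rate = seconds of stream covered by the capture
rate=0.5; frames=5
span=$(awk -v r="$rate" -v f="$frames" 'BEGIN { printf "%g", f / r }')
echo "$span seconds"   # 10 seconds
```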
How to run ffmpeg to take x number of screenshots?
1,371,492,830,000
I'm trying to switch from one advertised resolution/framerate to a different one on the fly, preferably while other applications are consuming the v4l2loopback feed. As an example, I feed a 1920x1080 black screen video into /dev/video2, and then open it in vlc. This works fine: $ ffmpeg -f lavfi -i color=c=black:s=1920x1080:r=25/1 -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/video2 $ ffmpeg -f v4l2 -list_formats all -i /dev/video2 ffmpeg version n4.3.1 Copyright (c) 2000-2020 the FFmpeg developers built with gcc 10.1.0 (GCC) configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-avisynth --enable-fontconfig --enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libiec61883 --enable-libjack --enable-libmfx --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librav1e --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-nvdec --enable-nvenc --enable-omx --enable-shared --enable-version3 libavutil 56. 51.100 / 56. 51.100 libavcodec 58. 91.100 / 58. 91.100 libavformat 58. 45.100 / 58. 45.100 libavdevice 58. 10.100 / 58. 10.100 libavfilter 7. 85.100 / 7. 85.100 libswscale 5. 7.100 / 5. 7.100 libswresample 3. 7.100 / 3. 7.100 libpostproc 55. 7.100 / 55. 7.100 [video4linux2,v4l2 @ 0x55864f9b06c0] Raw : yuv420p : Planar YUV 4:2:0 : 1920x1080 /dev/video2: Immediate exit requested However, killing the old feed and then streaming a different resolution into the device does not change the advertised capabilities, and just scrambles the screen in vlc. 
$ ffmpeg -f lavfi -i color=c=black:s=1280x800:r=25/1 -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/video2 # The list_formats options are still the same (only 1920x1080) # vlc shows a green instead of a black screen Is it possible to change this on the fly?
As long as the device is opened, its resolution (and format) are fixed. So the answer to your question is: no, you can't change these settings on the fly (unless you quit all consumers — in your case: VLC — before starting the new ffmpeg; but that's not what I would call "on the fly"). This is a limitation of the V4L2 API, where you cannot signal a resolution/format change to any attached application. There also seems to be an issue with ffmpeg, which should either adjust the output frame size if it cannot change it or refuse to run. (It could also be a bug in the v4l2loopback module, not properly reporting back that the frame size didn't change, but with GStreamer it seems to work as expected, so I'm not sure about it.)
How do I change the resolution/capabilities of a v4l2loopback device on the fly?
1,371,492,830,000
I can split an audio (or video) file by time, but how do I split it by file size? ffmpeg -i input.mp3 -ss S -to E -c copy output1.mp3 -ss S -to E -c copy output2.mp3 Which is fine if I have time codes, but if I want the output files to be split at 256MB regardless of the time length, what do I do? (What I am doing now is estimating, but that often means I have to make multiple runs at it with -ss S -to E to get files that are close to where I want in size).
The option you are looking for is -fs, which limits the output file size: ffmpeg stops writing once the limit is reached. Note that a single invocation only produces the first chunk; to split the whole input into multiple size-limited files you need repeated invocations, advancing -ss past what the previous chunk covered. Here is the official documentation: https://ffmpeg.org/ffmpeg.html#Main-options
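The overall splitting pattern (write a chunk, measure what was written, advance, repeat) can be sketched without ffmpeg at all — here head/tail stand in for the encoder, and the 100-byte limit and chunk names are purely illustrative:

```shell
# Split a 250-byte file into <=100-byte chunks, advancing an offset by
# the size of each chunk actually written — the same loop shape you
# would use around repeated `ffmpeg -ss ... -fs ...` invocations.
tmp=$(mktemp -d)
printf '%0250d' 0 > "$tmp/input"          # 250-byte input file
limit=100
offset=0; n=0
total=$(wc -c < "$tmp/input")
while [ "$offset" -lt "$total" ]; do
    tail -c +"$((offset + 1))" "$tmp/input" | head -c "$limit" > "$tmp/chunk$n"
    offset=$((offset + $(wc -c < "$tmp/chunk$n")))
    n=$((n + 1))
done
echo "$n chunks"   # 3 chunks: 100 + 100 + 50 bytes
```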
Use ffmpeg to split a file output by size
1,371,492,830,000
I'm trying to record lossless videos with ffmpeg of My screen My computer audio My microphone audio using this script: MIC="alsa_input.usb-Logitech_Logitech_USB_Headset-00.mono-fallback" MONITOR="alsa_output.usb-Logitech_Logitech_USB_Headset-00.analog-stereo.monitor" AUDIO0=$(pactl list short | grep "$MIC" | grep -Po "[0-9]+" | head -1) AUDIO1=$(pactl list short | grep "$MONITOR" | grep -Po "[0-9]+" | head -1) ffmpeg \ -video_size 1920x1080 \ -framerate 60 \ -f x11grab -i :0.0 \ -f pulse -i $AUDIO0 \ -f pulse -i $AUDIO1 \ -map 0 -map 1 -map 2 \ -c:a copy \ -c:v libx264rgb \ -crf 0 \ -preset ultrafast \ video.mkv On my slow computer it only records at about 7 FPS. Is there a way to record at a higher FPS while keeping it lossless and the file size fairly small? The 3 tracks also appear to be about a second out of sync with each other, with the screen first, the mic second, and the sound output third. I think it's because it recognizes them in that order when the recording is started. I can manually resync them, but it would be easier to fix the problem.
There are no faster presets for x264 than ultrafast, so you could: Reduce the framerate from 60 to a cinematic 24 or even 15, since we are talking about screen casting Use a different video codec Use hardware video encoding acceleration if your GPU supports it Add -thread_queue_size 1024 to the input options. Some people say the output rate matters for keeping everything in sync, so try adding -r 60. I see no other options. Your computer is really slow by today's standards.
Lossless ffmpeg recordings with low resource usage
1,371,492,830,000
When doing an mp3 → mp3 (or flac → mp3) conversion, -map_metadata can be used to copy metadata from the input file to the output file: ffmpeg -hide_banner -loglevel warning -nostats -i "${source}" -map_metadata 0 -vn -ar 44100 -b:a 256k -f mp3 "${target}" However, when I use this, I notice that it doesn't copy all the metadata correctly. Inspecting the input and output files with the tool eyeD3, I see this: $ eyeD3 input.mp3 input.mp3 [ 4.15 MB ] -------------------------------------------------------------------------------- Time: 01:46 MPEG1, Layer III [ 320 kb/s @ 44100 Hz - Stereo ] -------------------------------------------------------------------------------- ID3 v2.3: title: Track title artist: Artist Name album: Album Name album artist: Various Artists composer: Composer Name recording date: 2019 eyed3.id3:WARNING: Non standard genre name: Soundtracks track: 17/37 genre: Soundtracks (id None) disc: 1/1 FRONT_COVER Image: [Size: 86555 bytes] [Type: image/jpeg] Description: PRIV: [Data: 42 bytes] Owner Id: Google/StoreId PRIV: [Data: 40 bytes] Owner Id: Google/StoreLabelCode -------------------------------------------------------------------------------- $ eyeD3 path/to/output.mp3 /tmp/test.mp3 [ 3.26 MB ] -------------------------------------------------------------------------------- Time: 01:46 MPEG1, Layer III [ 256 kb/s @ 44100 Hz - Stereo ] -------------------------------------------------------------------------------- ID3 v2.4: title: Track title artist: Artist Name album: Album Name album artist: Various Artists composer: Composer Name recording date: 2019 eyed3.id3:WARNING: Non standard genre name: Soundtracks track: 17/37 genre: Soundtracks (id None) disc: 1/1 PRIV: [Data: 40 bytes] Owner Id: Google/StoreLabelCode PRIV: [Data: 42 bytes] Owner Id: Google/StoreId -------------------------------------------------------------------------------- Specifically, it's not copying the FRONT_COVER image correctly - somehow it's being dropped along the way. 
How can I ensure that the FRONT_COVER Image is copied during the conversion process?
The front cover is treated as a video stream with a special disposition. Use of -vn will disable its processing. Use ffmpeg -hide_banner -loglevel warning -nostats -i "${source}" -map_metadata 0 -c:v copy -disposition:v:0 attached_pic -ar 44100 -b:a 256k -f mp3 "${target}"
ffmpeg not copying FRONT_COVER image metadata during conversion
1,371,492,830,000
I have a sequence of images, named from 00001.png to 00322.png. I want to create a video from that image sequence, for which I have used the following command: ffmpeg -i %05d.png -c:v libx264 -vf fps=100 -pix_fmt yuv420p triangles.mp4 The video renders correctly, but the length is of 13 seconds (according to vlc or youtube), when it should be of 3 seconds. Am I doing something wrong?
Image sequences have a framerate associated with them. When not specified, a default value of 25 is set. The fps filter converts a stream from its input framerate to the target framerate. However, it aims to preserve sync, so frames are dropped or duplicated while source frames are kept as close as possible to their source timestamp. All you need to do here is set a custom framerate for the image sequence, so ffmpeg -framerate 100 -i %05d.png -c:v libx264 -pix_fmt yuv420p triangles.mp4
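The 13 seconds the asker saw is exactly what the default input rate predicts — 322 frames read at 25 fps, versus the intended 100 fps:

```shell
# duration = frame count / input framerate
frames=322
lengths=$(awk -v f="$frames" 'BEGIN { printf "%.2f %.2f", f / 25, f / 100 }')
echo "$lengths"   # 12.88 3.22  (seconds at 25 fps vs. 100 fps)
```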
ffmpeg video at 100 fps with 300 images gives 13 seconds
1,371,492,830,000
From the command line, I'd like to play an audio clip, or a random subset of a song, e.g., seconds 5 - 10. This does not seem to be a feature of paplay or mpg123, the two programs I've been using. ffmpeg allows me to trim a file (e.g., ffmpeg -i file.mkv -ss 20 -to 40 -c copy file-2.mkv), but I'd like to avoid creating a new file. Piping the above does not seem to work, though I may be doing it wrong. How can I do this?
This approach works fine for me. Use the force-format option -f to select wav, write to stdout, then pipe to e.g. aplay like so: ffmpeg -i input -ss 20 -to 40 -f wav - | aplay
How to randomly sample a subset of a song on command line
1,371,492,830,000
I am trying to make this script that loops through video files created between hours 00 and 12, converts them with ffmpeg and then removes them. The script works in terms of finding the files and starting ffmpeg, but it seems that it continues to "send" characters from the find -exec after ffmpeg has started on the first conversion, and it eventually breaks ffmpeg and stops the conversion. How can I modify the script so this does not happen? The current script #!/bin/bash -e find /videos/. -type f -print0 -exec sh -c 'h=$(date -d @$(stat -c %Y "$1") +%-H); [ "$h" -ge 00 ] && [ "$h" -lt 12 ]' sh {} \;|while read -d $'\0' i; do ffmpeg -y -i "$i" -vcodec libx264 -crf 27 -preset veryfast -movflags +faststart -c:a copy -threads 14 /output/"$(basename "$i" .ts)".mp4 rm -f -- "$i" done
Thanks to Gordon Davisson I managed to solve the problem. Here is the complete working script if someone happens to stumble upon this issue in the future. #!/bin/bash -e find /videos/. -type f -exec sh -c 'h=$(date -d @$(stat -c %Y "$1") +%-H); [ "$h" -ge 00 ] && [ "$h" -lt 12 ]' sh {} \; -print | while IFS= read -r i; do ffmpeg -y -i "$i" -vcodec libx264 -crf 27 -preset veryfast -movflags +faststart -c:a copy -threads 14 /output/"$(basename "$i" .ts)".mp4 </dev/null rm -f -- "$i" done
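The `</dev/null` on the ffmpeg line is the crucial part. Any command inside the loop that reads stdin would otherwise swallow the rest of the file list; here is a minimal bash demonstration with cat standing in for ffmpeg:

```shell
# Without the redirect, `cat` (like ffmpeg) inherits the loop's stdin
# and eats the remaining list entries, so only one iteration runs.
list=$'one\ntwo\nthree'
broken=0
while IFS= read -r i; do
    broken=$((broken + 1))
    cat >/dev/null                    # consumes "two" and "three"
done <<< "$list"
fixed=0
while IFS= read -r i; do
    fixed=$((fixed + 1))
    cat >/dev/null </dev/null         # stdin redirected away from the list
done <<< "$list"
echo "broken=$broken fixed=$fixed"    # broken=1 fixed=3
```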
Creating a for loop with find -exec and while
1,371,492,830,000
I have 4 noname ip-cams and have troubles with capturing rtsp streams. Randomly output file not even created. I'm capturing stream via ffmpeg. Tried on Ubuntu Server 18.04 with snap ffmpeg and Debian 7 with ordinary ffmpeg. The same touble. (/snap/bin/ffmpeg -y -use_wallclock_as_timestamps 1 -hide_banner -loglevel trace -i rtsp://192.168.0.16$1:554/11 -c:v copy -an -flags +global_header $VIDEO_FILE 2> $LOG_FOLDER/cam$1 ) & Tried -loglevel debug but it show only successful RTSP connection and that's all. With -loglevel trace I have noticed difference between successful and failed process. Error: unable to open display Splitting the commandline. Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'. Reading option '-use_wallclock_as_timestamps' ... matched as AVOption 'use_wallclock_as_timestamps' with argument '1'. Reading option '-hide_banner' ... matched as option 'hide_banner' (do not show program banner) with argument '1'. Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'trace'. Reading option '-i' ... matched as input url with argument 'rtsp://192.168.0.161:554/11'. Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'copy'. Reading option '-an' ... matched as option 'an' (disable audio) with argument '1'. Reading option '-flags' ... matched as AVOption 'flags' with argument '+global_header'. Reading option '/home/melo/samba/video/19-03-07/cam1/16-06-11.mp4' ... matched as output url. Finished splitting the commandline. Parsing a group of options: global . Applying option y (overwrite output files) with argument 1. Applying option hide_banner (do not show program banner) with argument 1. Applying option loglevel (set logging level) with argument trace. Successfully parsed a group of options. Parsing a group of options: input url rtsp://192.168.0.161:554/11. Successfully parsed a group of options. Opening an input file: rtsp://192.168.0.161:554/11. 
Probing rtsp score:100 size:0 [tcp @ 0x1b5cc40] No default whitelist set [tcp @ 0x1b5cc40] Original list of addresses: [tcp @ 0x1b5cc40] Address 192.168.0.161 port 554 [tcp @ 0x1b5cc40] Interleaved list of addresses: [tcp @ 0x1b5cc40] Address 192.168.0.161 port 554 [tcp @ 0x1b5cc40] Starting connection attempt to 192.168.0.161 port 554 [tcp @ 0x1b5cc40] Successfully connected to 192.168.0.161 port 554 [rtsp @ 0x1b5abc0] Sending: OPTIONS rtsp://192.168.0.161:554/11 RTSP/1.0 CSeq: 1 User-Agent: Lavf58.20.100 -- [rtsp @ 0x1b5abc0] ret=1 c=52 [R] [rtsp @ 0x1b5abc0] ret=1 c=54 [T] [rtsp @ 0x1b5abc0] ret=1 c=53 [S] [rtsp @ 0x1b5abc0] ret=1 c=50 [P] [rtsp @ 0x1b5abc0] ret=1 c=2f [/] [rtsp @ 0x1b5abc0] ret=1 c=31 [1] [rtsp @ 0x1b5abc0] ret=1 c=2e [.] [rtsp @ 0x1b5abc0] ret=1 c=30 [0] [rtsp @ 0x1b5abc0] ret=1 c=20 [ ] [rtsp @ 0x1b5abc0] ret=1 c=32 [2] [rtsp @ 0x1b5abc0] ret=1 c=30 [0] Last message repeated 1 times [rtsp @ 0x1b5abc0] ret=1 c=20 [ ] [rtsp @ 0x1b5abc0] ret=1 c=4f [O] [rtsp @ 0x1b5abc0] ret=1 c=4b [K] [rtsp @ 0x1b5abc0] ret=1 c=0d [ ] [rtsp @ 0x1b5abc0] ret=1 c=0a [ ] [rtsp @ 0x1b5abc0] line='RTSP/1.0 200 OK' [rtsp @ 0x1b5abc0] ret=1 c=43 [C] [rtsp @ 0x1b5abc0] ret=1 c=53 [S] [rtsp @ 0x1b5abc0] ret=1 c=65 [e] [rtsp @ 0x1b5abc0] ret=1 c=71 [q] [rtsp @ 0x1b5abc0] ret=1 c=3a [:] [rtsp @ 0x1b5abc0] ret=1 c=20 [ ] [rtsp @ 0x1b5abc0] ret=1 c=31 [1] [rtsp @ 0x1b5abc0] ret=1 c=0d [ ] [rtsp @ 0x1b5abc0] ret=1 c=0a [ ] Good coding and everithing clear in successful process. Error: unable to open display Splitting the commandline. Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'. Reading option '-use_wallclock_as_timestamps' ... matched as AVOption 'use_wallclock_as_timestamps' with argument '1'. Reading option '-hide_banner' ... matched as option 'hide_banner' (do not show program banner) with argument '1'. Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'trace'. Reading option '-i' ... 
matched as input url with argument 'rtsp://192.168.0.163:554/11'. Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'copy'. Reading option '-an' ... matched as option 'an' (disable audio) with argument '1'. Reading option '-flags' ... matched as AVOption 'flags' with argument '+global_header'. Reading option '/home/melo/samba/video/19-03-07/cam3/16-06-11.mp4' ... matched as output url. Finished splitting the commandline. Parsing a group of options: global . Applying option y (overwrite output files) with argument 1. Applying option hide_banner (do not show program banner) with argument 1. Applying option loglevel (set logging level) with argument trace. Successfully parsed a group of options. Parsing a group of options: input url rtsp://192.168.0.163:554/11. Successfully parsed a group of options. Opening an input file: rtsp://192.168.0.163:554/11. Probing rtsp score:100 size:0 [tcp @ 0x18eec40] No default whitelist set [tcp @ 0x18eec40] Original list of addresses: [tcp @ 0x18eec40] Address 192.168.0.163 port 554 [tcp @ 0x18eec40] Interleaved list of addresses: [tcp @ 0x18eec40] Address 192.168.0.163 port 554 [tcp @ 0x18eec40] Starting connection attempt to 192.168.0.163 port 554 [tcp @ 0x18eec40] Successfully connected to 192.168.0.163 port 554 [rtsp @ 0x18ecbc0] Sending: OPTIONS rtsp://192.168.0.163:554/11 RTSP/1.0 CSeq: 1 User-Agent: Lavf58.20.100 -- [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=08 [] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=04 [?] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=01 [?] [rtsp @ 0x18ecbc0] ret=1 c=48 [H] [rtsp @ 0x18ecbc0] ret=1 c=02 [?] 
[rtsp @ 0x18ecbc0] ret=1 c=2a [*] [rtsp @ 0x18ecbc0] ret=1 c=e0 [à] [rtsp @ 0x18ecbc0] ret=1 c=bc [Œ] [rtsp @ 0x18ecbc0] ret=1 c=84 [] [rtsp @ 0x18ecbc0] ret=1 c=c0 [À] [rtsp @ 0x18ecbc0] ret=1 c=a8 [š] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=a3 [£] [rtsp @ 0x18ecbc0] ret=1 c=ff [ÿ] Last message repeated 9 times [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=08 [] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=04 [?] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=01 [?] [rtsp @ 0x18ecbc0] ret=1 c=48 [H] [rtsp @ 0x18ecbc0] ret=1 c=02 [?] [rtsp @ 0x18ecbc0] ret=1 c=2a [*] [rtsp @ 0x18ecbc0] ret=1 c=e0 [à] [rtsp @ 0x18ecbc0] ret=1 c=bc [Œ] [rtsp @ 0x18ecbc0] ret=1 c=84 [] [rtsp @ 0x18ecbc0] ret=1 c=c0 [À] [rtsp @ 0x18ecbc0] ret=1 c=a8 [š] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=a3 [£] [rtsp @ 0x18ecbc0] ret=1 c=ff [ÿ] Last message repeated 9 times [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=08 [] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=04 [?] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=01 [?] [rtsp @ 0x18ecbc0] ret=1 c=48 [H] [rtsp @ 0x18ecbc0] ret=1 c=02 [?] [rtsp @ 0x18ecbc0] ret=1 c=2a [*] [rtsp @ 0x18ecbc0] ret=1 c=e0 [à] [rtsp @ 0x18ecbc0] ret=1 c=bc [Œ] [rtsp @ 0x18ecbc0] ret=1 c=84 [] [rtsp @ 0x18ecbc0] ret=1 c=c0 [À] [rtsp @ 0x18ecbc0] ret=1 c=a8 [š] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=a3 [£] [rtsp @ 0x18ecbc0] ret=1 c=ff [ÿ] Last message repeated 9 times [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=08 [] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=04 [?] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=01 [?] [rtsp @ 0x18ecbc0] ret=1 c=48 [H] [rtsp @ 0x18ecbc0] ret=1 c=02 [?] 
[rtsp @ 0x18ecbc0] ret=1 c=2a [*] [rtsp @ 0x18ecbc0] ret=1 c=e0 [à] [rtsp @ 0x18ecbc0] ret=1 c=bc [Œ] [rtsp @ 0x18ecbc0] ret=1 c=84 [] [rtsp @ 0x18ecbc0] ret=1 c=c0 [À] [rtsp @ 0x18ecbc0] ret=1 c=a8 [š] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=a3 [£] [rtsp @ 0x18ecbc0] ret=1 c=ff [ÿ] Last message repeated 9 times [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=08 [] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=04 [?] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=01 [?] [rtsp @ 0x18ecbc0] ret=1 c=48 [H] [rtsp @ 0x18ecbc0] ret=1 c=02 [?] [rtsp @ 0x18ecbc0] ret=1 c=2a [*] [rtsp @ 0x18ecbc0] ret=1 c=e0 [à] [rtsp @ 0x18ecbc0] ret=1 c=bc [Œ] [rtsp @ 0x18ecbc0] ret=1 c=84 [] [rtsp @ 0x18ecbc0] ret=1 c=c0 [À] [rtsp @ 0x18ecbc0] ret=1 c=a8 [š] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=a3 [£] [rtsp @ 0x18ecbc0] ret=1 c=ff [ÿ] Last message repeated 9 times [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=08 [] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=06 [?] [rtsp @ 0x18ecbc0] ret=1 c=04 [?] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=01 [?] [rtsp @ 0x18ecbc0] ret=1 c=48 [H] [rtsp @ 0x18ecbc0] ret=1 c=02 [?] [rtsp @ 0x18ecbc0] ret=1 c=2a [*] [rtsp @ 0x18ecbc0] ret=1 c=e0 [à] [rtsp @ 0x18ecbc0] ret=1 c=bc [Œ] [rtsp @ 0x18ecbc0] ret=1 c=84 [] [rtsp @ 0x18ecbc0] ret=1 c=c0 [À] [rtsp @ 0x18ecbc0] ret=1 c=a8 [š] [rtsp @ 0x18ecbc0] ret=1 c=00 [[rtsp @ 0x18ecbc0] ret=1 c=a3 [£] [rtsp @ 0x18ecbc0] ret=1 c=ff [ÿ] Last message repeated 9 times Looks like a sh*t. Log files aren't full, but I think it's enough. That appears randomly with random cams. How to fix it?
I've found the solution myself. It turned out the problem was with how the packets arrived over the default UDP transport. I added the flag to force a TCP connection (-rtsp_transport tcp) and it works. No problems anymore.
RTSP via ffmpeg
1,371,492,830,000
I have a large pcm file with about an hour's worth of data. I want to split it up into minute-long chunks. Is there a way for ffmpeg to do that, or some other utility? Basically, going from 0 - 3600s I want multiple files, each going from 0 - 60s, 61 - 120s, etc.
FFmpeg's segment muxer does this. ffmpeg -i in.wav -c copy -f segment -segment_time 60 out%d.wav This will create out0.wav, out1.wav, out2.wav ... , each 60 seconds long. If your input is raw PCM rather than WAV/AIFF, you'll need to manually set the input parameters e.g. ffmpeg -f s16le -channels 2 -ar 48000 -i in.pcm ...
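For a rough sanity check of the output count: an hour-long input with -segment_time 60 yields ceil(3600/60) files, out0.wav through out59.wav (the last segment may be shorter if the duration isn't an exact multiple):

```shell
# ceiling division: number of segments for a given duration and segment length
duration=3600; seg=60
count=$(( (duration + seg - 1) / seg ))
echo "$count segments"   # 60 segments
```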
Splitting up a pcm file into minute-long chunks using ffmpeg
1,371,492,830,000
I want to cut a segment from a long video with ffmpeg. I use this command: ffmpeg -i /home/nantembo/VideoPerl/1.mp4 -f avi -vcodec copy -acodec copy -ss 0:14:47 -t 0:58:55 /home/nantembo/VideoPerl/2.mp4 but I receive a video of duration 58:55 starting at 0:14:47, i.e. it runs past 0:14:47 + 0:44:08, where I wanted it to stop. I need a video which: starts at 0:14:47 ends at 0:58:55 How can I do it?
According to the ffmpeg manual, the -t option is the duration, not the end time. I think you're looking for the -to option: -to position (output) Stop writing the output at position. position must be a time duration specification, see the Time duration section in the ffmpeg-utils(1) manual. -to and -t are mutually exclusive and -t has priority. So in your case, the command will be: ffmpeg -i /home/nantembo/VideoPerl/1.mp4 -f avi -vcodec copy -acodec copy -ss 0:14:47 -to 0:58:55 /home/nantembo/VideoPerl/2.mp4
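Converting the timestamps to seconds makes the distinction concrete — the -t value that would be equivalent to -to 0:58:55 is the difference, 0:44:08:

```shell
# h:mm:ss -> seconds (bash; 10# guards against leading zeros read as octal)
to_secs() { IFS=: read -r h m s <<< "$1"; echo $(( 10#$h*3600 + 10#$m*60 + 10#$s )); }
start=$(to_secs 0:14:47)   # 887
end=$(to_secs 0:58:55)     # 3535
dur=$(( end - start ))
echo "$dur"   # 2648 seconds = 0:44:08
```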
How can I cut a segment of a video with ffmpeg?
1,371,492,830,000
This is somewhat related to Play subtitles automatically with mpv I am running mpv 0.26.0-3 and trying for the media file to load subtitles but is failing although mediainfo shows that there is en/utf-8 text file for about 80 KB . The media file is in mkv format - Format : Matroska Format version : Version 4 / Version 2 File size : 699 MiB Duration : 2 h 15 min Overall bit rate mode : Variable Overall bit rate : 723 kb/s Movie name : TamilRockers.com Encoded date : UTC 2017-10-11 11:55:33 Writing application : mkvmerge v7.8.0 ('River Man') 64bit built on Mar 27 2015 16:31:37 Writing library : libebml v1.3.1 + libmatroska v1.4.2 Video ID : 1 Format : AVC Format/Info : Advanced Video Codec Format profile : [email protected] Format settings : CABAC / 4 Ref Frames Format settings, CABAC : Yes Format settings, RefFrames : 4 frames Codec ID : V_MPEG4/ISO/AVC Duration : 2 h 15 min Bit rate mode : Variable Bit rate : 627 kb/s Maximum bit rate : 40.0 Mb/s Width : 640 pixels Height : 272 pixels Display aspect ratio : 2.35:1 Frame rate mode : Constant Frame rate : 23.976 (24000/1001) FPS Color space : YUV Chroma subsampling : 4:2:0 Bit depth : 8 bits Scan type : Progressive Bits/(Pixel*Frame) : 0.150 Stream size : 606 MiB (87%) Writing library : x264 core 142 r2431 ac76440 Encoding settings : cabac=1 / ref=5 / deblock=1:0:0 / analyse=0x3:0x113 / me=umh / subme=8 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=1 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=6 / lookahead_threads=1 / sliced_threads=0 / slices=4 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=1 / constrained_intra=0 / bframes=3 / b_pyramid=1 / b_adapt=2 / b_bias=0 / direct=3 / weightb=1 / open_gop=1 / weightp=1 / keyint=24 / keyint_min=1 / scenecut=40 / intra_refresh=0 / rc_lookahead=24 / rc=2pass / mbtree=1 / bitrate=627 / ratetol=1.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / cplxblur=20.0 / qblur=0.5 / vbv_maxrate=40000 / 
vbv_bufsize=30000 / nal_hrd=vbr / filler=0 / ip_ratio=1.40 / aq=1:1.00 Default : Yes Forced : No Audio ID : 2 Format : AAC Format/Info : Advanced Audio Codec Format profile : HE-AAC / LC Format settings : Explicit Codec ID : A_AAC-2 Duration : 2 h 15 min Bit rate : 93.8 kb/s Channel(s) : 6 channels Channel positions : Front: L C R, Side: L R, LFE Sampling rate : 48.0 kHz / 24.0 kHz Frame rate : 23.438 FPS (1024 SPF) Compression mode : Lossy Delay relative to video : 31 ms Stream size : 90.7 MiB (13%) Default : Yes Forced : No Text ID : 3 Format : UTF-8 Codec ID : S_TEXT/UTF8 Codec ID/Info : UTF-8 Plain Text Duration : 2 h 9 min Bit rate : 78 b/s Count of elements : 2491 Stream size : 74.3 KiB (0%) Default : No Forced : No Menu 00:00:00.000 : en:Chapter 01 00:02:05.267 : en:Chapter 02 00:05:09.367 : en:Chapter 03 00:10:03.400 : en:Chapter 04 00:22:55.734 : en:Chapter 05 00:34:40.668 : en:Chapter 06 00:44:35.035 : en:Chapter 07 00:58:14.802 : en:Chapter 08 01:10:22.502 : en:Chapter 09 01:14:09.669 : en:Chapter 10 01:22:36.236 : en:Chapter 11 01:30:17.736 : en:Chapter 12 01:35:45.570 : en:Chapter 13 01:41:16.837 : en:Chapter 14 01:56:03.705 : en:Chapter 15 01:59:13.306 : en:Chapter 16 02:11:47.606 : en:Chapter 17 I have not shared the hash or the filename for privacy reasons. but as can be seen there is this - Text ID : 3 Format : UTF-8 Codec ID : S_TEXT/UTF8 Codec ID/Info : UTF-8 Plain Text Duration : 2 h 9 min Bit rate : 78 b/s Count of elements : 2491 Stream size : 74.3 KiB (0%) Default : No Forced : No This is how my ~/.mpv/config is set up. ┌─[shirish@debian] - [~/.mpv] - [10033] └─[$] cat config 1 # Write your default config options here! 2 alang=eng,en,english,hin,hindi 3 slang=en,eng,english 4 sub-scale=1.25 I tried to toggle v while the media was playing but with no success. There are no subs. Toggling v says - a. Subtitles hidden b. subtitles visble (but no subtitles selected) How do I get out of this quagmire ?
The answer is to add either --sid=1 or --sid=2, depending on whether the file has one or more subtitle tracks embedded internally. The flag is also convenient when you have an internal subtitle and an external subtitle and want to choose between the two.
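As a minimal sketch of how track selection looks in practice (the filename movie.mkv is made up, and the command is printed rather than executed so the sketch runs without mpv or a media file):

```shell
# Build the mpv invocation for a chosen subtitle track id (sid).
# Printed instead of run, so this sketch needs neither mpv nor a video file.
play_with_sid() {
    printf 'mpv --sid=%s %s\n' "$2" "$1"
}

play_with_sid movie.mkv 1   # first embedded subtitle track
play_with_sid movie.mkv 2   # second track, e.g. an external one loaded alongside
```

Once you trust the command line, drop the printf wrapper and run mpv directly.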
unable to get mpv to play embedded subtitles even with config file setting on
1,371,492,830,000
I'm building a Docker image that enables OpenCV with ffmpeg support. My Dockerfile looks like RUN apt-get update && apt-get install -y \ git \ curl \ wget \ unzip \ ffmpeg \ build-essential \ cmake git pkg-config libswscale-dev \ libtbb2 libtbb-dev libjpeg-dev \ libpng-dev libtiff-dev libjasper-dev \ python3-numpy RUN cd \ && wget https://github.com/opencv/opencv/archive/3.1.0.zip \ && unzip 3.1.0.zip \ && cd opencv-3.1.0 \ && mkdir build \ && cd build \ && cmake -D CMAKE_BUILD_TYPE=RELEASE \ -D CMAKE_INSTALL_PREFIX=/usr/local \ -D WITH_CUDA=OFF \ -D INSTALL_C_EXAMPLES=OFF \ -D INSTALL_PYTHON_EXAMPLES=OFF \ -D WITH_FFMPEG=ON \ -D BUILD_NEW_PYTHON_SUPPORT=ON \ -D WITH_TBB=ON .. \ && make -j"$(nproc)" \ && make install \ && cd \ && rm 3.1.0.zip When I build my container ffmpeg is installed, but OpenCV does not detect it as I can clearly see from the general configuration: -- Video I/O: -- DC1394 1.x: NO -- DC1394 2.x: NO -- FFMPEG: NO -- codec: NO -- format: NO -- util: YES (ver 54.31.100) -- swscale: YES (ver 3.1.101) -- resample: NO -- gentoo-style: YES -- GStreamer: NO -- OpenNI: NO -- OpenNI PrimeSensor Modules: NO -- OpenNI2: NO -- PvAPI: NO -- GigEVisionSDK: NO -- UniCap: NO -- UniCap ucil: NO -- V4L/V4L2: NO/YES -- XIMEA: NO -- Xine: NO even if ffpmeg gets installed ffmpeg version 2.8.11-0ubuntu0.16.04.1 Copyright (c) 2000-2017 the FFmpeg developers built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.4) 20160609 configuration: --prefix=/usr --extra-version=0ubuntu0.16.04.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi 
--enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv
The ffmpeg binary package is for the ffmpeg command-line tool. The development headers you need to compile against are in different packages, a fair number of them. Thankfully, they all come from the ffmpeg source package, so you can get a list relatively easily: On Stretch, which uses actual ffmpeg: (stretch)$ grep-aptavail -s Package -F Source ffmpeg | grep -- '-dev$' Package: libavcodec-dev Package: libavdevice-dev Package: libavfilter-dev Package: libavformat-dev Package: libavresample-dev Package: libavutil-dev Package: libpostproc-dev Package: libswresample-dev Package: libswscale-dev Package: libffmpegthumbnailer-dev or on Jessie, where Debian used the libav fork: (jessie)$ grep-aptavail -s Package -F Source libav | grep -- '-dev$' Package: libavcodec-dev Package: libavdevice-dev Package: libavfilter-dev Package: libavformat-dev Package: libavresample-dev Package: libavutil-dev Package: libswscale-dev Package: libavc1394-dev Package: libavl-dev I don't have an Ubuntu box handy to check there. Note how the particular things OpenCV is saying "yes"/"no" to match those library names.
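Applied to the question's Dockerfile, that means installing the corresponding -dev packages before running CMake. This RUN line is a sketch under the assumption of Debian/Ubuntu package names, not a verified build:

```dockerfile
# Development headers that OpenCV's FFMPEG probe looks for
RUN apt-get update && apt-get install -y \
    libavcodec-dev \
    libavformat-dev \
    libavutil-dev \
    libavresample-dev \
    libswscale-dev
```

After rebuilding, the FFMPEG line in OpenCV's configuration summary should flip to YES if the headers were found.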
Docker image with OpenCV and FFmpeg
1,371,492,830,000
According to this article, I successfully used the command $ ffmpeg -vf "select='eq(pict_type,I)'" -i somevideo.mp4 -vsync 0 -f image2 /tmp/thumbnails-%02d.jpg I tried the second command: $ ffmpeg -vf "select='gt(scene\,0.9)'" -i somevideo.mp4 -vsync 0 -f image2 /tmp/thumbnails-%02d.jpg but it ended with the error: Undefined constant or missing '(' in 'scene' This is because: $ ffmpeg -version ffmpeg version 0.8.17-4:0.8.17-0ubuntu0.12.04.1, Copyright (c) 2000-2014 the Libav developers built on Mar 16 2015 13:28:23 with gcc 4.6.3 The ffmpeg program is only provided for script compatibility and will be removed in a future release. It has been deprecated in the Libav project to allow for incompatible command line syntax improvements in its replacement called avconv (see Changelog for details). Please use avconv instead. ffmpeg 0.8.17-4:0.8.17-0ubuntu0.12.04.1 libavutil 51. 22. 3 / 51. 22. 3 libavcodec 53. 35. 0 / 53. 35. 0 libavformat 53. 21. 1 / 53. 21. 1 libavdevice 53. 2. 0 / 53. 2. 0 libavfilter 2. 15. 0 / 2. 15. 0 libswscale 2. 1. 0 / 2. 1. 0 libpostproc 52. 0. 0 / 52. 0. 0 I tried to use avconv instead. It runs both commands successfully, but in both cases it generates incorrect results (too many frames, so it seemingly ignores the video filter expression). How can I correct my ffmpeg or avconv to give the right results?
First of all, -vf needs to be specified after the input in order to affect it, and that seems to be the only reason avconv worked for the second command: it must have discarded your filter without even parsing it. If you move the argument after -i, it will result in the same error as ffmpeg gave you; newer versions of ffmpeg actually treat that as an error. Now the reason why neither of the commands is working is simple: neither of the versions that you are using supports the scene filter. What's more, the scene filter still appears to be missing in the master branch of avconv as of now; thus, avconv simply does not support it. As for ffmpeg, the filter was introduced in r7286814, which didn't make it into your build. Hence, you need to obtain an up-to-date version if you want to use the filter. Once installed, move -vf after -i, and run your command to get your results. $ ffmpeg -i somevideo.mp4 -vf "select='gt(scene,0.9)'" -vsync 0 -f image2 /tmp/thumbnails-%02d.jpg
Incorrect scene change detection with avconv
1,371,492,830,000
I have a movie split into many parts, each 10-30 seconds long. All parts are MPEG TS files. I want to merge them. I tried the following: ffmpeg -f concat -i join.txt OUTPUT.TS and ffmpeg -i "concat:INPUT-1|INPUT-2" -c copy OUTPUT.TS Both methods do the job, but the resulting movie has an issue if I click somewhere forward or backward: the video freezes while the audio keeps going fine, and only then does the video start again. I suppose I should restore or create some additional frames to make the movie smooth. What parameters should be passed to make the resulting movie smooth with the same quality?
What parameters should be passed to make the resulting movie smooth with the same quality? You probably need to regenerate the timestamps. Each clip has a separate timestamp stream, so when you concatenate them, the player perceives time as going backwards whenever it jumps from one part of the stream into a different one that was originally part of a different TS file. Try adding -fflags +genpts in there, just before the output file name. Incidentally, you want -c copy with the first command form as well. Otherwise, it may re-encode the files along the way.
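Putting both points together, the first command form would become something like the line below (join.txt and OUTPUT.TS are the question's own placeholders; the command is only echoed here, since actually running it needs ffmpeg and the input parts):

```shell
# Concat demuxer + stream copy + regenerated timestamps, assembled as one command.
cmd="ffmpeg -f concat -i join.txt -c copy -fflags +genpts OUTPUT.TS"
echo "$cmd"
```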
FFMPEG glue MPEG TS
1,371,492,830,000
I'm trying to build ffmpeg with NVENC support so I can then build obs-studio with NVENC support, using this as a guide. I've sorted out every dependency after a bit of headache, and am now to the point where I should be able to compile ffmpeg with the edits to its rules file just fine. However, my trusty terminal spits the following out at me: http://pastebin.com/888pa3kW Here is my ffmpeg/debian/rules file: #!/usr/bin/make -f export V=1 # sets DEBIAN_VERSION variable include /usr/share/dpkg/pkg-info.mk # Get the Debian version revision: DEB_REVISION := $(word 2, $(subst -, ,$(DEB_VERSION))) # sets DEB_HOST_* variables include /usr/share/dpkg/architecture.mk # Ubuntu ld adds -Bsymbolic-functions by default, but that prevents FFmpeg from building. export DEB_LDFLAGS_MAINT_STRIP=-Wl,-Bsymbolic-functions # Package name for the extra flavor. EXTRA_PKGS := $(shell sed -nr 's/^Package:[[:space:]]*(.*extra[0-9]+)[[:space:]]*$$/\1/p' debian/control) FLAVORS = standard extra static # Enable as many features as possible, as long as the result is still GPLv2+ (a GPLv3+ variant is built as libavcodec-extra/libavfilter-extra flavor). 
# The following flags (and build-dependencies) are not added, because they would require a libavformat-extra flavor: # --enable-libsmbclient (libsmbclient-dev [!hurd-i386 !m68k !sparc64]) # The following flags are not added, because the necessary libraries are not in Debian: # --enable-decklink # --enable-libcelt (see #676592: removed from Debian as abandoned upstream, replaced by opus) # --enable-libdcadec # --enable-libilbc (see #675959 for the RFP bug) # --enable-libkvazaar # --enable-libmfx # --enable-libnut # --enable-libopenh264 # --enable-libopenmpt # --enable-libschroedinger (see #845037: removal due to security issues) # --enable-libutvideo # --enable-libvidstab (see #709193 for the RFP bug) # --enable-libxavs # --enable-libzimg # The following flags are not added for various reasons: # * --enable-librtmp: ffmpeg has better built-in RTMP support with listen mode. # * --enable-libv4l2 [!hurd-any]: This is only needed for very old devices and may cause problems for others. # Should anyone need it, using LD_PRELOAD pointing on libv4l2 has the same effect. # * --enable-opencl [!hurd-any]: This is considered an experimental API. 
CONFIG := --prefix=/usr \ --extra-version="$(DEB_REVISION)" \ --toolchain=hardened \ --libdir=/usr/lib/$(DEB_HOST_MULTIARCH) \ --incdir=/usr/include/$(DEB_HOST_MULTIARCH) \ --enable-gpl \ --disable-stripping \ --enable-avresample \ --enable-avisynth \ --enable-gnutls \ --enable-ladspa \ --enable-libass \ --enable-libbluray \ --enable-libbs2b \ --enable-libcaca \ --enable-libcdio \ --enable-libebur128 \ --enable-libflite \ --enable-libfontconfig \ --enable-libfreetype \ --enable-libfribidi \ --enable-libgme \ --enable-libgsm \ --enable-libmodplug \ --enable-libmp3lame \ --enable-libopenjpeg \ --enable-libopus \ --enable-libpulse \ --enable-librubberband \ --enable-libshine \ --enable-libsnappy \ --enable-libsoxr \ --enable-libspeex \ --enable-libssh \ --enable-libtheora \ --enable-libtwolame \ --enable-libvorbis \ --enable-libvpx \ --enable-libwavpack \ --enable-libwebp \ --enable-libx265 \ --enable-libxvid \ --enable-libzmq \ --enable-libzvbi \ --enable-omx \ --enable-openal \ --enable-opengl \ --enable-sdl2 \ --enable-nonfree \ --enable-nvenc # The standard configuration only uses the shared CONFIG. CONFIG_standard = --enable-shared # With these enabled, resulting binaries are effectively licensed as GPLv3+. CONFIG_extra = --enable-shared \ --enable-version3 \ --disable-doc \ --disable-programs \ --enable-libopencore_amrnb \ --enable-libopencore_amrwb \ --enable-libtesseract \ --enable-libvo_amrwbenc # The static libraries should not be built with PIC. CONFIG_static = --disable-pic \ --disable-doc \ --disable-programs # Disable optimizations if requested. ifneq (,$(filter $(DEB_BUILD_OPTIONS),noopt)) CONFIG += --disable-optimizations endif # Respect CC/CXX from the environment, if they differ from the default. # Don't set them if they equal the default, because that disables autodetection needed for cross-building. ifneq ($(CC),cc) CONFIG += --cc=$(CC) endif ifneq ($(CXX),g++) CONFIG += --cxx=$(CXX) endif # Some libraries are built only on linux. 
ifeq ($(DEB_HOST_ARCH_OS),linux) CONFIG += --enable-libdc1394 \ --enable-libiec61883 endif # Some build-dependencies are not installable on some architectures. ifeq (,$(filter $(DEB_HOST_ARCH),powerpcspe)) CONFIG_extra += --enable-netcdf endif # ffmpeg is involed in build-dependency cycles with opencv, x264 and chromaprint, so disable them in stage one. # Also disable frei0r, which build-depends on opencv. ifneq ($(filter stage1,$(DEB_BUILD_PROFILES)),) CONFIG += --disable-frei0r \ --disable-chromaprint \ --disable-libopencv \ --disable-libx264 else CONFIG += --enable-libopencv \ --enable-frei0r ifeq (,$(filter $(DEB_HOST_ARCH),powerpcspe)) CONFIG += --enable-libx264 endif ifeq (,$(filter $(DEB_HOST_ARCH),sh4)) CONFIG += --enable-chromaprint endif endif # Disable altivec optimizations on powerpc, because they are not always available on this architecture. ifeq ($(DEB_HOST_ARCH),powerpc) CONFIG += --disable-altivec # Build an altivec flavor of the libraries on powerpc. # This works around the problem that runtime cpu detection on powerpc currently does not work, # because, if altivec is enabled, all files are build with '-maltivec' so that the compiler inserts altivec instructions, wherever it likes. CONFIG_altivec = --enable-shared \ --enable-altivec \ --disable-doc \ --disable-programs CONFIG_altivec-extra = $(CONFIG_altivec) $(CONFIG_extra) FLAVORS += altivec altivec-extra endif # Disable assembly optimizations on x32, because they don't work (yet). ifneq (,$(filter $(DEB_HOST_ARCH),x32)) CONFIG += --disable-asm endif # Disable optimizations on mips(el) and some on mips64(el), because they are not always available on these architectures. ifneq (,$(filter $(DEB_HOST_ARCH),mips mipsel mips64 mips64el)) CONFIG += --disable-mipsdsp \ --disable-mipsdspr2 \ --disable-loongson3 \ --disable-mips32r6 \ --disable-mips64r6 endif ifneq (,$(filter $(DEB_HOST_ARCH),mips mipsel)) CONFIG += --disable-mipsfpu endif # Set cross-build prefix for compiler, pkg-config... 
# Cross-building also requires to manually set architecture/OS. ifneq ($(DEB_BUILD_GNU_TYPE),$(DEB_HOST_GNU_TYPE)) CONFIG += --cross-prefix=$(DEB_HOST_GNU_TYPE)- \ --arch=$(DEB_HOST_ARCH) \ --target-os=$(DEB_HOST_ARCH_OS) endif # Use the default debhelper scripts, where possible. %: dh $@ # Add configuration options: override_dh_auto_configure: $(foreach flavor,$(FLAVORS),mkdir -p debian/$(flavor);) $(foreach flavor,$(FLAVORS),set -e; echo " *** $(flavor) ***"; cd debian/$(flavor); ../../configure $(CONFIG) $(CONFIG_$(flavor)) || (cat config.log && exit 1); cd ../.. ;) touch override_dh_auto_configure # Remove the subdirectories generated for the flavors. override_dh_auto_clean: $(foreach flavor,$(FLAVORS),[ ! -d debian/$(flavor) ] || rm -r debian/$(flavor);) # Create doxygen documentation: override_dh_auto_build-indep: dh_auto_build -i --sourcedirectory=debian/standard -- apidoc # Create the minified CSS files. lessc debian/missing-sources/ffmpeg-web/src/less/style.less | cleancss > debian/standard/doc/style.min.css rm override_dh_auto_configure override_dh_auto_build-arch: # Copy built object files to avoid building them again for the extra flavor. # Build qt-faststart here, to make it possible to build with 'nocheck'. 
set -e && for flavor in $(FLAVORS); do \ echo " *** $$flavor ***"; \ if echo "$$flavor" | grep -q "extra"; then \ subdir=`[ "$$flavor" = "extra" ] && echo "debian/standard/" || echo "debian/altivec/"`; \ for dir in `cd ./$$subdir; find libavcodec libavdevice libavfilter libavformat libavresample libavutil libpostproc libswscale libswresample -type d`; do \ mkdir -p debian/"$$flavor"/"$$dir"; \ echo "$$subdir$$dir"/*.o | grep -q '*' || cp "$$subdir$$dir"/*.o debian/"$$flavor"/"$$dir"; \ done; \ rm debian/"$$flavor"/libavcodec/allcodecs.o; \ rm debian/"$$flavor"/libavfilter/allfilters.o; \ fi; \ if [ "$$flavor" = "standard" ]; then \ $(MAKE) -C debian/standard tools/qt-faststart; \ fi; \ dh_auto_build -a --sourcedirectory=debian/"$$flavor" || (cat debian/"$$flavor"/config.log && exit 1); \ done # Set the library path for the dynamic linker, because the tests otherwise don't find the libraries. override_dh_auto_test-arch: export LD_LIBRARY_PATH="libavcodec:libavdevice:libavfilter:libavformat:libavresample:libavutil:libpostproc:libswresample:libswscale"; \ dh_auto_test -a --sourcedirectory=debian/standard -- -k # No tests for indep build. 
override_dh_auto_test-indep: override_dh_auto_install-arch: dh_auto_install -a --sourcedirectory=debian/standard ifeq ($(DEB_HOST_ARCH),powerpc) install -d debian/tmp/usr/lib/$(DEB_HOST_MULTIARCH)/altivec install -m 644 debian/altivec/*/*.so.* debian/tmp/usr/lib/$(DEB_HOST_MULTIARCH)/altivec endif dh_auto_install -a --sourcedirectory=debian/extra --destdir=debian/tmp/extra ifeq ($(DEB_HOST_ARCH),powerpc) install -d debian/tmp/extra/usr/lib/$(DEB_HOST_MULTIARCH)/altivec install -m 644 debian/altivec-extra/*/*.so.* debian/tmp/extra/usr/lib/$(DEB_HOST_MULTIARCH)/altivec endif # Use the static libraries from the --disable-pic build install -m 644 debian/static/*/lib*.a debian/tmp/usr/lib/$(DEB_HOST_MULTIARCH) override_dh_auto_install-indep: dh_auto_install -i --sourcedirectory=debian/standard override_dh_install: dh_install $(addprefix -p,$(EXTRA_PKGS)) --sourcedir=debian/tmp/extra dh_install --remaining-packages override_dh_makeshlibs: set -e && for pkg in $(shell dh_listpackages -a) ; do \ case $$pkg in \ ffmpeg|*-dev) \ continue \ ;; \ *avcodec*) \ soversion=$$(echo $$pkg | sed -nr 's/^[^0-9]*([0-9]+)$$/\1/p'); \ dh_makeshlibs -p $$pkg -V"libavcodec$$soversion (>= ${DEB_VERSION_EPOCH_UPSTREAM}) | libavcodec-extra$$soversion (>= ${DEB_VERSION_EPOCH_UPSTREAM})" \ ;; \ *avfilter*) \ soversion=$$(echo $$pkg | sed -nr 's/^[^0-9]*([0-9]+)$$/\1/p'); \ dh_makeshlibs -p $$pkg -V"libavfilter$$soversion (>= ${DEB_VERSION_EPOCH_UPSTREAM}) | libavfilter-extra$$soversion (>= ${DEB_VERSION_EPOCH_UPSTREAM})" \ ;; \ *) \ dh_makeshlibs -p $$pkg -V \ ;; \ esac \ done # Don't compress the example source code files. override_dh_compress: dh_compress -Xexamples I am running Linux Lite 3.2 64bit, which is based on Ubuntu 16.04. I'm essentially completely new to this, so I'll need some hand holding.
I could build it without any issue in an LXC Ubuntu 16.04 container. vi /etc/apt/sources.list and add the source & backports repositories: deb http://archive.ubuntu.com/ubuntu xenial main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu xenial-updates main restricted universe multiverse deb http://security.ubuntu.com/ubuntu xenial-security main restricted universe multiverse deb http://security.ubuntu.com/ubuntu xenial-backports main restricted universe multiverse deb-src http://archive.ubuntu.com/ubuntu xenial main restricted universe multiverse deb-src http://archive.ubuntu.com/ubuntu xenial-updates main restricted universe multiverse deb-src http://security.ubuntu.com/ubuntu xenial-security main restricted universe multiverse deb-src http://security.ubuntu.com/ubuntu xenial-backports main restricted universe multiverse Install dependencies and build tools, including debhelper & dh-autoreconf from the backports repo; use apt-cache policy ... to check their current versions. apt update apt build-dep ffmpeg apt install git openssl ca-certificates devscripts dh-autoreconf=12~ubuntu16.04.1 debhelper=10.2.2ubuntu1~ubuntu16.04.1 libchromaprint-dev libebur128-dev libleptonica-dev libnetcdf-dev libomxil-bellagio-dev libopenjp2-7-dev librubberband-dev libsdl2-dev libtesseract-dev nasm Download the source git clone https://anonscm.debian.org/git/pkg-multimedia/ffmpeg.git Modify the rules file to add --enable-nonfree and --enable-nvenc cd ffmpeg/ echo 'libva 1 libva1' > debian/shlibs.local vi debian/rules: CONFIG :=... --enable-sdl2 \ --enable-nonfree \ --enable-nvenc Build it debuild -us -uc -b Here is the list of resulting Debian packages. Reply to OP, regarding the new error messages: lintian is a QC tool for Debian packages. It just verifies the resulting packages and has no effect on the building process. Now running lintian...
E: ffmpeg changes: bad-distribution-in-changes-file unstable W: libavdevice57: virtual-package-depends-without-real-package-depends depends: libgl1 N: 9 tags overridden (8 warnings, 1 info) Finished running lintian. However, if you want to correct that error message: it is raised because we copied a source package prepared for the unstable Debian release, whereas in our case it should target the xenial Ubuntu release. Run dch inside the ffmpeg/ folder to add a new entry to debian/changelog and set it to xenial, for example: ffmpeg (7:3.2.2-1ubuntu1) xenial; urgency=medium * backport to xenial -- root <root@ci2> Wed, 28 Dec 2016 11:24:08 +0000 libavfilter-extra* is an alternative to libavfilter* and they can't be installed together on the same system. You have to choose depending on your needs (if you don't know, install extra): dpkg: regarding libavfilter-extra6_3.2.2-1_amd64.deb containing libavfilter-extra6:amd64: libavfilter-extra6 conflicts with libavfilter6 Other missing dependencies that are available in the repo, like: ffmpeg-doc depends on libjs-bootstrap; however: Package libjs-bootstrap is not installed. could be installed using: sudo apt -f install
Help compiling ffmpeg with NVENC support under Linux
1,371,492,830,000
I use the following code to convert WAV to ALAC (bash, macOS 10.12.1): find . -type f -iname "*.wav" | while read fn; do ffmpeg -i "$fn" -acodec alac "${fn%.wav}.m4a"; done But there seems to be a mistake since it prints warnings like this: n---8085/03_Part_III.wav: No such file or directory The correct path would be: Bad_Religion/wav/Bad_Religion---8085/03_Part_III.wav For some reason the path is truncated. What's wrong with the command?
Your file names are not actually being truncated. Here, ffmpeg is trying to read commands from its input stream. Unfortunately, this is the same stream read is using to determine filenames, so it appears that parts of these filenames are not being read. To fix this, you should tell ffmpeg to disable interaction on the input stream with the -nostdin flag.
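The effect can be reproduced without ffmpeg at all: any inner command that reads stdin will steal the lines read was about to deliver, and redirecting the inner command's input away (the generic equivalent of -nostdin) restores the loop. A self-contained sketch:

```shell
# `cat` stands in for ffmpeg reading stdin; the file list stands in for find's output.
printf 'one.wav\ntwo.wav\nthree.wav\n' > /tmp/wavlist.txt

broken() {
    while read fn; do
        cat > /dev/null              # swallows the rest of the file list
        echo "converting: $fn"
    done < /tmp/wavlist.txt
}

fixed() {
    while read fn; do
        cat > /dev/null < /dev/null  # same idea as ffmpeg's -nostdin
        echo "converting: $fn"
    done < /tmp/wavlist.txt
}

broken   # prints one line only: the other two "filenames" were consumed by cat
fixed    # prints all three lines
```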
Problems converting WAV to ALAC by a batch job
1,371,492,830,000
I'm on a dedicated server with Root access. not familiar with servers. Im trying to install FFMpeg on my server. I'm getting errors can't figure out how to solve it. So any light on this will be very appreciated. [root@ns335004 ~]# yum update base | 3.6 kB 00:00:00 http://apt.sw.be/redhat/el7/en/x86_64/dag/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found Trying other mirror. To address this issue please refer to the below knowledge base article https://access.redhat.com/articles/1320623 If above article doesn't help to resolve this issue please create a bug on https://bugs.centos.org/ One of the configured repositories failed (DAG RPM Repository), and yum doesn't have enough cached data to continue. At this point the only safe thing yum can do is fail. There are a few ways to work "fix" this: 1. Contact the upstream for the repository and get them to fix the problem. 2. Reconfigure the baseurl/etc. for the repository, to point to a working upstream. This is most often useful if you are using a newer distribution release than is supported by the repository (and the packages for the previous distribution release still work). 3. Disable the repository, so yum won't use it by default. Yum will then just ignore the repository until you permanently enable it again or use --enablerepo for temporary usage: yum-config-manager --disable dag 4. Configure the failing repository to be skipped, if it is unavailable. Note that yum will try to contact the repo. when it runs most commands, so will have to try and fail each time (and thus. yum will be be much slower). If it is a very temporary problem though, this is often a nice compromise: yum-config-manager --save --setopt=dag.skip_if_unavailable=true failure: repodata/repomd.xml from dag: [Errno 256] No more mirrors to try. 
http://apt.sw.be/redhat/el7/en/x86_64/dag/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found repolist [root@ns335004 ~]# yum repolist all http://apt.sw.be/redhat/el7/en/x86_64/dag/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found Trying other mirror. To address this issue please refer to the below knowledge base article https://access.redhat.com/articles/1320623 If above article doesn't help to resolve this issue please create a bug on https://bugs.centos.org/ http://apt.sw.be/redhat/el7/en/x86_64/dag/repodata/repomd.xml: [Errno 14] HTTP Error 404 - Not Found Trying other mirror. repo id repo name status C7.0.1406-base/x86_64 CentOS-7.0.1406 - Base disabled C7.0.1406-centosplus/x86_64 CentOS-7.0.1406 - CentOSPlus disabled C7.0.1406-extras/x86_64 CentOS-7.0.1406 - Extras disabled C7.0.1406-fasttrack/x86_64 CentOS-7.0.1406 - CentOSPlus disabled C7.0.1406-updates/x86_64 CentOS-7.0.1406 - Updates disabled C7.1.1503-base/x86_64 CentOS-7.1.1503 - Base disabled C7.1.1503-centosplus/x86_64 CentOS-7.1.1503 - CentOSPlus disabled C7.1.1503-extras/x86_64 CentOS-7.1.1503 - Extras disabled C7.1.1503-fasttrack/x86_64 CentOS-7.1.1503 - CentOSPlus disabled C7.1.1503-updates/x86_64 CentOS-7.1.1503 - Updates disabled base/7/x86_64 CentOS-7 - Base enabled: 9,007 base-debuginfo/x86_64 CentOS-7 - Debuginfo disabled base-source/7 CentOS-7 - Base Sources disabled c7-media CentOS-7 - Media disabled centosplus/7/x86_64 CentOS-7 - Plus disabled centosplus-source/7 CentOS-7 - Plus Sources disabled cr/7/x86_64 CentOS-7 - cr disabled dag/7/x86_64 DAG RPM Repository enabled: 0 epel/x86_64 Extra Packages for Enterprise Linux 7 - x86_64 enabled: 10,764 epel-debuginfo/x86_64 Extra Packages for Enterprise Linux 7 - x86_64 - Debug disabled epel-source/x86_64 Extra Packages for Enterprise Linux 7 - x86_64 - Source disabled epel-testing/x86_64 Extra Packages for Enterprise Linux 7 - Testing - x86_64 disabled epel-testing-debuginfo/x86_64 Extra Packages for Enterprise Linux 7 - Testing 
- x86_64 - Debug disabled epel-testing-source/x86_64 Extra Packages for Enterprise Linux 7 - Testing - x86_64 - Source disabled extras/7/x86_64 CentOS-7 - Extras enabled: 393 extras-source/7 CentOS-7 - Extras Sources disabled fasttrack/7/x86_64 CentOS-7 - fasttrack disabled nux-dextop/x86_64 Nux.Ro RPMs for general desktop use disabled nux-dextop-testing/x86_64 Nux.Ro RPMs for general desktop use - testing disabled plesk-php-5.6 PHP v 5.6 for Plesk - x86_64 enabled: 31 plesk-php-7.0 PHP v 7.0 for Plesk - x86_64 enabled: 28 updates/7/x86_64 CentOS-7 - Updates enabled: 2,560 updates-source/7 CentOS-7 - Updates Sources disabled repolist: 22,783 I also tried: sudo yum clean metadata sudo yum clean all But still having same 404 Error. Thanks.
That's the old DAG repository, which is essentially obsolete. You should remove it, either by removing the repository setup package you installed originally, or by setting enabled=0 in the repository's file in /etc/yum.repos.d. The EL7 repo at ATrpms has builds of FFmpeg. They're pretty old, but probably suitable for most purposes. If you want the latest and greatest, you may have to build it from source. It isn't especially difficult these days.
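Setting enabled=0 is a one-line edit; the snippet below rehearses it on a scratch copy, since the real file name under /etc/yum.repos.d (dag.repo here) is a guess, editing it needs root, and GNU sed's -i is assumed:

```shell
# Fabricate a minimal repo file to practice on (contents are illustrative).
cat > /tmp/dag.repo <<'EOF'
[dag]
name=DAG RPM Repository
enabled=1
EOF

# Flip the switch, exactly as you would on the real file:
sed -i 's/^enabled=1/enabled=0/' /tmp/dag.repo
grep '^enabled=' /tmp/dag.repo   # prints: enabled=0
```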
Mirror not found when trying to install FFMPEG on CENTOS7
1,371,492,830,000
Source Community, I've been trying to figure out an ffmpeg command with the following requirements while converting 'avi' to 'mp4' with the H264 video codec. One command I tried was a generic one like this, which is recommended on most forums: ffmpeg -I input.avi -acodec copy -vcodec copy output.mp4 But this copies the same video codec and doesn't convert to H264. Can any of you help me compose a line of code that would do the task with the following requirements? => Video Options Codec: H264 Video  Aspect  Ratio: No Change Video  Resolution: No Change Video  FPS: No Change => Audio Options Codec: AC Audio  Channels: No Change Audio  Frequency: No Change Audio  Normalization: No Change Thanks in advance!
Let us enumerate the parameters to ffmpeg then. -acodec is better written -c:a (mnemonic: codec for audio) -vcodec is better written -c:v (same mnemonic) -i is the input file (not -I) ffmpeg does pretty good guesswork based on file extensions, therefore doing: ffmpeg -i file.webm file.mp4 will convert things, but probably with pretty poor quality. For H264 you are after the libx264 codec, therefore it should go: ffmpeg -i file.avi -c:v libx264 -c:a copy file.mp4 As a test, let's use the classic webm example: $ wget http://video.webmfiles.org/big-buck-bunny_trailer.webm ... $ ffmpeg -i big-buck-bunny_trailer.webm -c:a copy -c:v libx264 bbb.mp4 ... $ ffprobe bbb.mp4 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'bbb.mp4': Metadata: major_brand : isom minor_version : 512 compatible_brands: isomiso2avc1mp41 encoder : Lavf57.41.100 Duration: 00:00:32.50, start: 0.000000, bitrate: 414 kb/s Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 341 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default) Metadata: handler_name : VideoHandler Stream #0:1(eng): Audio: vorbis (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 64 kb/s (default) Metadata: handler_name : SoundHandler And that looks promising: stream #0:0 is an H264-encoded video stream.
Building Requirements-Specific Command For 'ffmpeg' Tool
1,436,039,802,000
I have ffmpeg with x11grab on my local machine, but I want to capture the screen of an X server on 12.34.56.78. How can I do that? The following doesn't work: ffmpeg -f x11grab -r 25 -s 800x600 -i 12.34.56.78:0.0 screen.avi
The remote X server must give you permission to contact it. The simplest solution would be: xhost + ...given on the remote side. But beware: it enables this grabbing for everybody and from everywhere, which you probably don't want. In that case a better solution would be: xhost +1.2.3.4 ...where 1.2.3.4 is the IP address from which you contact the remote X server. If you want to be very secure, you could use xauth as well; here you can find a tutorial for that (it is 2-3 commands or so).
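As a sketch, the per-IP variant pairs up with the question's command like this (1.2.3.4 stands for the machine you run ffmpeg on; both lines are only printed here, since running them needs a live X session on each end):

```shell
# On the remote machine, inside its X session: allow one specific client IP.
remote_cmd="xhost +1.2.3.4"
# On the local machine: point x11grab at the remote display, as in the question.
local_cmd="ffmpeg -f x11grab -r 25 -s 800x600 -i 12.34.56.78:0.0 screen.avi"
printf '%s\n%s\n' "$remote_cmd" "$local_cmd"
```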
x11grab from another display server
1,436,039,802,000
I encode a video file using ffmpeg like this: $ ffmpeg -i input.avi -pass 1 -an output.mp4 $ ffmpeg -i input.avi -pass 2 -ab 128k -y output.mp4 So I always have to type two commands; is there a way to do a 2-pass encode at once? I change the options often, and of course the input and output file names are different each time.
Instead of running these as 2 separate commands you can run them on one command line like so: $ ffmpeg -i input.avi -pass 1 -an output.mp4 && \ ffmpeg -i input.avi -pass 2 -ab 128k -y output.mp4 The difference is the && notation which will run the second command (the 2nd pass) only if the first command was successful. They're still 2 separate operations, but this will allow you to run one command line vs. the 2 you were having to do previously. Also this will have the benefit of running the 2nd pass immediately upon completion of the 1st pass, where with your way you'd have to essentially wait for the 1st to finish before kicking off the 2nd.
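The && short-circuit can be seen with trivial stand-ins for the two passes (plain true/false here instead of the actual ffmpeg commands):

```shell
# 'false' simulates a failed first pass: the command after && never runs.
false && echo "second pass runs" || echo "first pass failed, second skipped"
# 'true' simulates a successful first pass: the second command runs.
true && echo "second pass runs"
```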
2pass encoding by ffmpeg at once
1,436,039,802,000
When installing ZoneMinder 1.25.0 in CentOS 6.4 (64-bit) the following error pops up when executing make: zm_ffmpeg_camera.cpp:105:44: error: missing binary operator before token "(" Full log: zm_ffmpeg_camera.cpp:105:44: error: missing binary operator before token "(" In file included from zm_ffmpeg_camera.cpp:24: zm_ffmpeg_camera.h:39: error: ISO C++ forbids declaration of ‘AVFormatContext’ with no type zm_ffmpeg_camera.h:39: error: expected ‘;’ before ‘*’ token zm_ffmpeg_camera.h:41: error: ISO C++ forbids declaration of ‘AVCodecContext’ with no type zm_ffmpeg_camera.h:41: error: expected ‘;’ before ‘*’ token zm_ffmpeg_camera.h:42: error: ISO C++ forbids declaration of ‘AVCodec’ with no type zm_ffmpeg_camera.h:42: error: expected ‘;’ before ‘*’ token zm_ffmpeg_camera.h:44: error: ISO C++ forbids declaration of ‘AVFrame’ with no type zm_ffmpeg_camera.h:44: error: expected ‘;’ before ‘*’ token zm_ffmpeg_camera.h:45: error: ISO C++ forbids declaration of ‘AVFrame’ with no type zm_ffmpeg_camera.h:45: error: expected ‘;’ before ‘*’ token zm_ffmpeg_camera.cpp: In constructor ‘FfmpegCamera::FfmpegCamera(int, const std::string&, int, int, int, int, int, int, int, bool)’: zm_ffmpeg_camera.cpp:35: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:37: error: ‘mCodecContext’ was not declared in this scope zm_ffmpeg_camera.cpp:38: error: ‘mCodec’ was not declared in this scope zm_ffmpeg_camera.cpp:40: error: ‘mRawFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:41: error: ‘mFrame’ was not declared in this scope zm_ffmpeg_camera.cpp: In destructor ‘virtual FfmpegCamera::~FfmpegCamera()’: zm_ffmpeg_camera.cpp:46: error: ‘mFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:46: error: ‘av_freep’ was not declared in this scope zm_ffmpeg_camera.cpp:47: error: ‘mRawFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:51: error: ‘sws_freeContext’ was not declared in this scope zm_ffmpeg_camera.cpp:54: error: ‘mCodecContext’ was not 
declared in this scope zm_ffmpeg_camera.cpp:56: error: ‘avcodec_close’ was not declared in this scope zm_ffmpeg_camera.cpp:59: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:61: error: ‘av_close_input_file’ was not declared in this scope zm_ffmpeg_camera.cpp: In member function ‘void FfmpegCamera::Initialise()’: zm_ffmpeg_camera.cpp:78: error: ‘AV_LOG_DEBUG’ was not declared in this scope zm_ffmpeg_camera.cpp:78: error: ‘av_log_set_level’ was not declared in this scope zm_ffmpeg_camera.cpp:80: error: ‘AV_LOG_QUIET’ was not declared in this scope zm_ffmpeg_camera.cpp:80: error: ‘av_log_set_level’ was not declared in this scope zm_ffmpeg_camera.cpp:82: error: ‘av_register_all’ was not declared in this scope zm_ffmpeg_camera.cpp: In member function ‘virtual int FfmpegCamera::PrimeCapture()’: zm_ffmpeg_camera.cpp:94: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:94: error: ‘av_open_input_file’ was not declared in this scope zm_ffmpeg_camera.cpp:95: error: ‘errno’ was not declared in this scope zm_ffmpeg_camera.cpp:98: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:98: error: ‘av_find_stream_info’ was not declared in this scope zm_ffmpeg_camera.cpp:99: error: ‘errno’ was not declared in this scope zm_ffmpeg_camera.cpp:103: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:108: error: ‘CODEC_TYPE_VIDEO’ was not declared in this scope zm_ffmpeg_camera.cpp:118: error: ‘mCodecContext’ was not declared in this scope zm_ffmpeg_camera.cpp:118: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:121: error: ‘mCodec’ was not declared in this scope zm_ffmpeg_camera.cpp:121: error: ‘avcodec_find_decoder’ was not declared in this scope zm_ffmpeg_camera.cpp:125: error: ‘mCodec’ was not declared in this scope zm_ffmpeg_camera.cpp:125: error: ‘avcodec_open’ was not declared in this scope zm_ffmpeg_camera.cpp:129: error: ‘mRawFrame’ was not declared 
in this scope zm_ffmpeg_camera.cpp:129: error: ‘avcodec_alloc_frame’ was not declared in this scope zm_ffmpeg_camera.cpp:132: error: ‘mFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:135: error: ‘PIX_FMT_RGB24’ was not declared in this scope zm_ffmpeg_camera.cpp:135: error: ‘avpicture_get_size’ was not declared in this scope zm_ffmpeg_camera.cpp:138: error: ‘AVPicture’ was not declared in this scope zm_ffmpeg_camera.cpp:138: error: expected primary-expression before ‘)’ token zm_ffmpeg_camera.cpp:138: error: ‘avpicture_fill’ was not declared in this scope zm_ffmpeg_camera.cpp:141: error: ‘SWS_BICUBIC’ was not declared in this scope zm_ffmpeg_camera.cpp:141: error: ‘sws_getCachedContext’ was not declared in this scope zm_ffmpeg_camera.cpp: In member function ‘virtual int FfmpegCamera::Capture(Image&)’: zm_ffmpeg_camera.cpp:159: error: ‘AVPacket’ was not declared in this scope zm_ffmpeg_camera.cpp:159: error: expected ‘;’ before ‘packet’ zm_ffmpeg_camera.cpp:163: error: ‘mFormatContext’ was not declared in this scope zm_ffmpeg_camera.cpp:163: error: ‘packet’ was not declared in this scope zm_ffmpeg_camera.cpp:163: error: ‘av_read_frame’ was not declared in this scope zm_ffmpeg_camera.cpp:172: error: ‘mCodecContext’ was not declared in this scope zm_ffmpeg_camera.cpp:172: error: ‘mRawFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:172: error: ‘avcodec_decode_video2’ was not declared in this scope zm_ffmpeg_camera.cpp:182: error: ‘mRawFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:182: error: ‘mCodecContext’ was not declared in this scope zm_ffmpeg_camera.cpp:182: error: ‘mFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:182: error: ‘sws_scale’ was not declared in this scope zm_ffmpeg_camera.cpp:188: error: ‘mCodecContext’ was not declared in this scope zm_ffmpeg_camera.cpp:188: error: ‘mFrame’ was not declared in this scope zm_ffmpeg_camera.cpp:193: error: ‘av_free_packet’ was not declared in this scope make[2]: *** 
[zm_ffmpeg_camera.o] Error 1 make[2]: Leaving directory `/root/cam/ZoneMinder-1.25.0/src' make[1]: *** [all-recursive] Error 1 make[1]: Leaving directory `/root/cam/ZoneMinder-1.25.0' make: *** [all] Error 2
Turns out the latest stable release of ffmpeg (1.2.2) is not compatible with ZoneMinder 1.25.0. Installing the 0.9 version of ffmpeg solved this issue.

wget http://www.ffmpeg.org/releases/ffmpeg-0.9.tar.gz
tar -xzvf ffmpeg-0.9.tar.gz
cd ffmpeg-0.9
./configure --enable-gpl --enable-shared --enable-pthreads
make
make install
make install-libs
ZoneMinder compiling error: "missing binary operator before token "(""
1,436,039,802,000
When I want to convert a video with the following command: ffmpeg -i file.wmv -sameq file.mpg it ends with this error message: FFmpeg version SVN-r0.5.1-4:0.5.1-1ubuntu1.1, Copyright (c) 2000-2009 Fabrice Bellard, et al. configuration: --extra-version=4:0.5.1-1ubuntu1.1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static libavutil 49.15. 0 / 49.15. 0 libavcodec 52.20. 1 / 52.20. 1 libavformat 52.31. 0 / 52.31. 0 libavdevice 52. 1. 0 / 52. 1. 0 libavfilter 0. 4. 0 / 0. 4. 0 libswscale 0. 7. 1 / 0. 7. 1 libpostproc 51. 2. 0 / 51. 2. 0 built on Mar 31 2011 18:53:20, gcc: 4.4.3 [wmv3 @ 0x82dbc50]Extra data: 8 bits left, value: 0 Seems stream 1 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 30.00 (30/1) Input #0, asf, from 'file.wmv': Duration: 00:16:53.68, start: 3.065000, bitrate: 3827 kb/s Stream #0.0: Audio: 0x0162, 44100 Hz, stereo, s16, 48 kb/s Stream #0.1: Video: wmv3, yuv420p, 1920x1080, 3750 kb/s, 30 tbr, 1k tbn, 1k tbc Could not open 'file.mpg' What is wrong? How can I fix it?
The problem lies here:

Audio: 0x0162

It's the audio format identifier, called a TwoCC; looking it up we find that 0x0162 is Windows Media Audio Professional V9. This codec is just not supported by your version of ffmpeg; according to the ffmpeg home page the WMA Pro decoder was added in ffmpeg version 0.6, so check that out. HTH
ffmpeg could not open file.mpg
1,436,039,802,000
I have some Bluray disks I am attempting to rip video from. Normally I'd use ffmpeg and select a playlist to rip and be done with it. With these discs, however, the videos make use of the alternate camera angles feature. My understanding is that both camera angles are encoded into a single video stream. The video codec on disc is VC-1. I have tried just ripping the playlist as usual. On my current machine, and playback with mpv simply will show one camera angle at a snail's pace (no hw-accelerated VC-1 decode). Re-encoding to another format such as FFV1 will play at full speed, but again, the one camera angle. My goal is to rip these videos with the camera angle of my choice using open source software. I've tried opening up an .mpls file with a hex editor to peek at which .m2ts files are referenced so I can rip those individually and stitch them together, but I have not found success in ripping the individual .m2ts files. If I set one as an input to ffmpeg: ffmpeg -i BDMV/STREAM/00000.m2ts -map 0:v -map 0:a -c:a copy -c:v copy output.mkv, I get back "BDMV/STREAM/00000.m2ts: Invalid data found when processing input". I assume that's because it needs to be decrypted? Not sure how to get ffmpeg to make use of libaacs to decrypt when trying to use .m2ts as input rather than .mpls So, how can I rip specific camera angles from a bluray video using free open source software available on Linux?
Use an up-to-date version of bluray_copy. Version 1.9 is working for me. The program I had been using to rip my Blurays, bluray_copy from the bluray_info project, was version 1.3, which as of today is still the latest unstable ebuild available in the Gentoo repository. With version 1.3, I failed to rip anything other than the default camera angle. I downloaded and compiled the latest source from the git repository, and that version correctly ripped the video using the camera angle passed on the command line. I have since created a local Gentoo ebuild for the latest tagged version, 1.9, and that also works as intended.
Ripping multi-angle bluray video
1,436,039,802,000
I am streaming the desktop over rtp using ffmpeg from computer A. Here is my ffmpeg command:

ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :1.0 -c:v libx264 -preset fast -pix_fmt bgr0 -b:v 3M -g 25 -an -f rtp_mpegts rtp://230.0.0.1:5005

I can play the live stream in VLC on computer A at "rtp://@230.0.0.1:5005". But I can't play the stream from computer B, which is on the same network as computer A, by opening "rtp://@230.0.0.1:5005" in VLC. If I stream an mp4 file over http, then computer B is able to play it. For http streaming, I simply go to VLC -> Media -> Stream -> (adding the mp4 file), and stream in HTTP format on port 8080 on A, then open it in VLC on machine B with "http://serverIP:portnumber". What am I doing wrong here?
Why can't I receive the rtp stream on Ubuntu? Because you are using a multicast address (230.0.0.1) and your current setup does not have a multicast path between the two hosts. So one way to solve the problem right now is to use unicast transmission. Just replace 230.0.0.1 with the IP address of the host you are going to be watching from:

ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :1.0 -c:v libx264 -preset fast -pix_fmt bgr0 -b:v 3M -g 25 -an -f rtp_mpegts rtp://a.b.c.d:5005

where a.b.c.d is the IP address of the Ubuntu host. Then, on the Ubuntu host, you can do:

vlc rtp://@:5005
Why can't I receive an rtp stream on Ubuntu?
1,436,039,802,000
I record math lectures for my students using QuickTime (audio and video). QuickTime does not offer much control over the input audio gain. I would like to make sure that all my recordings have the same output volume level. Is there a simple way to achieve this using ffmpeg?
This will require audio re-encoding: https://trac.ffmpeg.org/wiki/AudioVolume

Also check: https://superuser.com/questions/323119/how-can-i-normalize-audio-using-ffmpeg

Lastly, normalization does not always help, because you may have a relatively quiet audio track with peaks, which will make normalization impossible. In this case you'll need to use dynamic range compression, e.g. https://medium.com/@jud.dagnall/dynamic-range-compression-for-audio-with-ffmpeg-and-compand-621fe2b1a892
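As a rough sketch of what a manual normalization pass boils down to (the loudness numbers below are made-up example values, not measurements): the gain to apply is the difference between the target and the measured integrated loudness, and that figure feeds ffmpeg's volume filter.

```shell
# Hypothetical loudness values in LUFS, kept as integers for shell arithmetic
# (real measurements, e.g. from ffmpeg's volumedetect/loudnorm, are fractional).
measured=-23    # integrated loudness of one recording
target=-16      # level you want every recording to hit
gain=$(( target - measured ))    # dB of gain needed
echo "volume=${gain}dB"          # -> volume=7dB
# which would then be applied with something like:
#   ffmpeg -i lecture.mov -c:v copy -af "volume=7dB" lecture-normalized.mov
```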
Use ffmpeg to achieve uniform output volume levels across different video recordings
1,436,039,802,000
My laptop has a Ryzen 2200U with a Radeon Vega 3, and I am using it with CentOS 8, KDE 5 and Chromium version 81. I installed x264 and gstreamer-plugin-blah stuffs, but still both Konqueror and Chromium say: "Your browser does not currently recognize any of the video formats available." and YouTube suggests checking for HTML5 support, but according to https://www.youtube.com/html5 my browser supports HTML5. Strangely, if I copy the streaming link and paste it into VLC, it works perfectly. I googled it, but most of the answers are for Ubuntu, and even the answer for Fedora doesn't work since epel seems to not have a chromium-plugin-ffmpeg package. I also installed RPM Fusion, but it wasn't there either. So is there an equivalent on CentOS, and if there is, which package(s) do I have to install?
If you don't have an aversion to proprietary software, I'd recommend installing Google Chrome, as it supports H.264 by default: https://www.google.com/chrome/

As for Konqueror, installing gstreamer1-plugin-openh264 might help, but I'm not sure it's available for CentOS 8.
YouTube Live Streaming doesn't work on CentOS
1,436,039,802,000
Situation:
- Running macOS 10.13.6 and using bash 5.0.17(1)
- A lot of subdirectories which hold multiple files.
- Need to filter out files in subdirectories with a specific extension (.avi).
- Need to process all .avi files and remux to .mp4 using ffmpeg.

ffmpeg uses the following syntax for remuxing:

ffmpeg -i in.avi -c copy out.mp4

Output format: in the same folder as the source .avi file, and using the same filename (apart from the .avi extension).

Example file structure:

$ find . -maxdepth 2
.
./abc
./abc/abc.avi
./xyz
./xyz/xyz.avi.mp4
./123
./123/123.avi

In this case I would like to filter out the files ./abc/abc.avi and ./123/123.avi, which I can do using regular expressions and find:

$ find -E . -iregex ".*\.avi"
./abc/abc.avi
./123/123.avi

The desired remuxed .mp4 output filenames would then be:

./abc/abc.mp4
./123/123.mp4

How can I:
- using a script, remux all these .avi files to .mp4 container with one command? I am not sure how to pipe the output of find to the input of ffmpeg, and at the same time specify the desired output filenames.
- delete the original .avi files, but only if the remux was successful?
Credit to Cbhihe's answer, as it got me on the right path. I needed to change a few things because of the way find works on macOS and the handling of spaces in filenames. macOS find uses a somewhat special syntax. You can use the following one-liner:

find -E . -type f -iregex ".*\.avi" -execdir bash -c 'in=$1; out=${in%.*}.mp4; (ffmpeg -y -i "$in" -c copy "$out") 2>/dev/null && \rm "$in"' shellproc {} \;

find -E . -type f -iregex ".*\.avi" searches for files: . indicates to search the current directory, -E enables extended find, which allows the use of regular expressions, and -iregex ".*\.avi" matches all .avi files in this path, including subdirectories.

-execdir bash -c '<COMMAND>' shellproc {} \; executes COMMAND in a bash shell for every search result, from the same directory as the search result. The filename is passed on as argument $1.

in=$1; out=${in%.*}.mp4; (ffmpeg -y -i "$in" -c copy "$out") 2>/dev/null && \rm "$in" defines the input and output filenames, runs ffmpeg to do the remuxing while suppressing its verbose output, and removes the input file only when the remuxing finished successfully.

On Linux systems, the find syntax is a little different:

find . -type f -regex ".*\.avi" -execdir bash -c 'in=$1; out=${in%.*}.mp4; (ffmpeg -y -i "$in" -c copy "$out") 2>/dev/null && \rm "$in"' shellproc {} \;
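The filename juggling in the one-liner is plain bash parameter expansion and can be checked in isolation:

```shell
# ${in%.*} strips the shortest trailing '.<suffix>' -- i.e. the extension --
# so appending .mp4 yields the desired output name in the same directory.
in='./abc/abc.avi'
out=${in%.*}.mp4
echo "$out"              # -> ./abc/abc.mp4

# Only the last extension is stripped, so an already-remuxed file keeps
# its double extension (and is matched by the .avi regex no further):
in='./xyz/xyz.avi.mp4'
echo "${in%.*}.mp4"      # -> ./xyz/xyz.avi.mp4
```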
ffmpeg across .avi files in subdirectories
1,436,039,802,000
Ffmpeg is showing it's playing an icecast stream, but there is no audio coming out. Restarting the program produces audio again. If an icecast feed goes away, ffmpeg keeps running even though there's no audio, for some reason. I need to detect this and restart it if there's an issue.
Set stimeout to e.g. 10000000 (10 seconds; the option takes microseconds); then ffmpeg will quit if it doesn't receive any further data within 10 seconds. You can then run ffmpeg in a loop, restarting it whenever it exits.
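A minimal supervisor loop could look like this; here false stands in for the actual (hypothetical) ffmpeg invocation so the retry logic itself can be demonstrated, and the attempt cap of 3 is an arbitrary choice:

```shell
# Supervisor sketch: substitute your real stream-capture command for 'false'.
attempts=0
max=3
while [ "$attempts" -lt "$max" ]; do
    if false; then        # e.g.: if ffmpeg -i "$STREAM_URL" out.mp3; then
        break             # ffmpeg exited cleanly; stop restarting
    fi
    attempts=$((attempts + 1))
    # sleep 1             # small back-off before the next restart
done
echo "restarts: $attempts"    # -> restarts: 3 with the failing stand-in
```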
How can I check from the command line to see if FFMpeg is actively playing audio?
1,552,663,800,000
Try to capture RTSP stream with ffmpeg. Everything goes nice if I save the video to my home folder, but I can't save it to another directory: ffmpeg says 'Permission denied' even though the directory permission is 777. In short:

ffmpeg -i 'rtsp://192.168.0.161:554/11' -c:v copy -an new.mp4              # good
ffmpeg -i 'rtsp://192.168.0.161:554/11' -c:v copy -an folder777/new.mp4    # Permission denied

Ubuntu Server 18.04.02, ffmpeg snap package v.4.1. Any suggestion?
The thing is, I had tried to save the video to folders located on a mounted partition, and my snap package had no connection to the interface "removable-media". After connecting it, everything works great.
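For reference, connecting a snap to the removable-media interface is a one-off configuration step; the snap name ffmpeg below matches this setup, so adjust it if yours differs:

```shell
# Show the snap's current plug/slot connections, then connect the plug:
snap connections ffmpeg
sudo snap connect ffmpeg:removable-media
```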
ffmpeg permissions trouble
1,552,663,800,000
I want to play a video at a certain time, like an alarm. For instance: at 07:00, play video.mp4. I have tried this with crontab and with at, but no success yet.
I wrote a little script for that:

#!/bin/bash
[ "$1" = "-q" ] && shift && quiet=true || quiet=false
hms=(${1//:/ })
printf -v now '%(%s)T' -1
printf -v tzoff '%(%z)T' $now
tzoff=$((0${tzoff:0:1}(3600*10#${tzoff:1:2}+60*10#${tzoff:3:2})))
slp=$(((86400+(now-now%86400)+10#$hms*3600+10#${hms[1]}*60+10#${hms[2]}-tzoff-now)%86400))
$quiet || printf 'Alarm goes off at %(%c)T.\n' $((now+slp))
sleep $slp
mplayer /path/to/video.mp4

Call it with the desired time like alarm.bash 7, alarm.bash 7:1:3 or alarm.bash 07:01:03. You may use the -q option to disable the terminal output. Designed to serve as an alarm clock, it's not possible to set a time more than 23:59:59 in the future with this script; I suggest combining it with cron if necessary.
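The heart of the script is the seconds-until-alarm arithmetic. Leaving the timezone term aside, it can be exercised with fixed numbers (assumed example: now is exactly a midnight boundary, target time 07:00:00):

```shell
# Assumed example values: 'now' is a midnight boundary (now % 86400 == 0),
# and the requested alarm time is 07:00:00.
now=172800
h=7 m=0 s=0
slp=$(( (86400 + (now - now % 86400) + h*3600 + m*60 + s - now) % 86400 ))
echo "$slp"    # -> 25200 seconds, i.e. 7 hours until the alarm fires
```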
Start playing a video at a certain time
1,552,663,800,000
I am looking for a way to create a CBR TS file from a high-bitrate MXF input file. I have tried to use ffmpeg, but apparently it doesn't do a good job of creating a CBR output file, so right now I am a bit clueless about what I can use. I have tried:

ffmpeg -i input.mxf -copyts -c copy -muxrate 200M -f mpegts output.ts

I only want to add some stuffing. I have also tried to transcode the video with the command:

ffmpeg -i input.mxf -vcodec libx264 -b:v 150M -minrate:v 150M -maxrate:v 150M -bufsize:v 140M -acodec mp2 -ac 2 -b:a 192k -f mpegts output.ts

but it didn't work out either. I want the output to be completely flat. This could be done by setting the muxrate to a much higher value or by achieving a completely CBR video bitrate. The ffmpeg version with which I have tried is 3.2.4.
I found a way to achieve rather nice-looking and smooth CBR output with 10-15% of stuffing. Unfortunately it requires transcoding the original file:

$ ffmpeg -i input.mxf \
    -c:v libx264 \
    -x264opts nal-hrd=cbr \
    -b:v 30M -minrate:v 30M -maxrate:v 30M -muxrate 35M -bufsize:v 25M \
    -acodec aac -ac 2 -b:a 128k \
    -f mpegts output.ts

This command will create a completely CBR TS from an input file. Here it is important that the video bitrate (b:v) be equal to the minimal and maximal video bitrates, that the muxrate be 10-15% higher than the set video bitrate plus the audio bitrate, and that the bufsize be around 70% of the video bitrate.
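The 10-15% muxrate headroom can be sanity-checked from the stream bitrates used in the command (values in kbit/s; the 15% figure is this answer's rule of thumb, not an exact TS overhead):

```shell
video_kbps=30000    # from -b:v 30M
audio_kbps=128      # from -b:a 128k
muxrate=$(( (video_kbps + audio_kbps) * 115 / 100 ))
echo "${muxrate} kbit/s"    # -> 34647 kbit/s, close to the -muxrate 35M above
```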
Create CBR TS file from MXF file
1,552,663,800,000
I am trying to install ffmpeg on a CentOS 6.8 server and I am getting some errors relating to required libs. How do I install those missing libs? Where do I find them? What should I do to install FFMPEG? Here are the errors: Error: Package: libavdevice-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libcdio_paranoia.so.1()(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libgnutls.so.28(GNUTLS_1_4)(64bit) Error: Package: libavdevice-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libopenal.so.1()(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libgmp.so.10()(64bit) Error: Package: ffmpeg-compat-0.6.7-9.el7.nux.x86_64 (nux-dextop) Requires: libjack.so.0()(64bit) Error: Package: xvidcore-1.3.2-5.el7.nux.x86_64 (nux-dextop) Requires: libm.so.6(GLIBC_2.15)(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: liblzma.so.5()(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libopus.so.0()(64bit) Error: Package: libavdevice-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libcdio_cdda.so.1()(64bit) Error: Package: libavdevice-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libcdio_cdda.so.1(CDIO_CDDA_1)(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: liblzma.so.5(XZ_5.0)(64bit) Error: Package: ffmpeg-compat-0.6.7-9.el7.nux.x86_64 (nux-dextop) Requires: libgmp.so.10()(64bit) Error: Package: librtmp-2.4-2.20131205.gitdc76f0a.el7.nux.x86_64 (nux-dextop) Requires: libgnutls.so.28(GNUTLS_2_12)(64bit) Error: Package: librtmp-2.4-2.20131205.gitdc76f0a.el7.nux.x86_64 (nux-dextop) Requires: libgnutls.so.28(GNUTLS_1_4)(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libgnutls.so.28()(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libass.so.5()(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libopenjpeg.so.1()(64bit) Error: Package: 
ffmpeg-compat-0.6.7-9.el7.nux.x86_64 (nux-dextop) Requires: libopenjpeg.so.1()(64bit) Error: Package: libavdevice-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libcdio_paranoia.so.1(CDIO_PARANOIA_1)(64bit) Error: Package: x265-libs-1.9-1.el7.nux.x86_64 (nux-dextop) Requires: libm.so.6(GLIBC_2.15)(64bit) Error: Package: ffmpeg-libs-2.6.8-3.el7.nux.x86_64 (nux-dextop) Requires: libgnutls.so.28(GNUTLS_3_0_0)(64bit) Error: Package: librtmp-2.4-2.20131205.gitdc76f0a.el7.nux.x86_64 (nux-dextop) Requires: libgnutls.so.28()(64bit) You could try using --skip-broken to work around the problem You could try running: rpm -Va --nofiles --nodigest RepoList: 200 packages excluded due to repository protections repo id repo name status base CentOS-6 - Base 6,634+62 dag DAG RPM Repository 4,634+84 extras CentOS-6 - Extras 61 hgdedi HG Monitoring Repo 369 ksplice-uptrack Ksplice Uptrack for CentOS 14 nodesource Node.js Packages for Enterprise Linux 6 - x86_64 49 nux-dextop Nux.Ro RPMs for general desktop use 2,347+123 rpmforge RHEL 6 - RPMforge.net - dag 4,634+84 ul UL 58 ul_hostgator UL_HostGator 8 updates CentOS-6 - Updates 137 repolist: 18,945
Looks like you have been using both the DAG and NUX repos. The NUX repos are more up to date, so when yum looks for dependencies it pulls in packages built for CentOS 7; I'm sure that is the reason the dependencies do not get installed. If you disable NUX for a moment and only use DAG for this purpose, I think the instructions mentioned here will work fine. I have tried them on CentOS 6 myself. https://chrisjean.com/install-ffmpeg-and-ffmpeg-php-on-centos-easily/ I hope that works for you.
How do I install ffmpeg on CentOS 6.8 with all dependencies? I am getting many "Error package ... requires ..."
1,552,663,800,000
I'm trying to use a url piped from tshark into a while loop. while read line ; do echo "$line" ffmpeg -i "$line" -c copy "filename" done < <(tshark -i tun0 -B 50 -P -V -q -l -Y 'http matches "(?<=\[Full request URI: )(http://mywebsite.com/file.*)(?=\])"' 2>&1 | grep --line-buffered -Po "(?<=\[Full request URI: )(http://mywebsite.com/file.*)(?=\])" | unbuffer -p uniq) Inside the loop, I'm able to echo $line just fine; and it looks like this: http://mywebsite.com/file?eid=5345944&fmt=5&app_id=214748364&range=20-30&etsp=1456777145&hmac=1K9nwkA8TOgtOXAsakSfMMVWsuE But for some reason, I'm unable to use this same "line" variable to feed ffmpeg, inside the same while loop. Doing ffmpeg -i "$line" -c copy "filename" results in (all quotes are accurately copy pasted) [http @ 0x1ab5ec0] HTTP error 400 Bad Request: Server returned 400 Bad Request5345932&fmt=5&app_id=214748364&range=20-30&etsp=1456779359&hmac=35B2lA6D0zfR2DmfdPS4ZcilYxg On the other hand, copying the url (from the echo output), double quoting it and using the same ffmpeg command in a terminal works perfectly. Also, for some reason, the command is truncated when running the script with -xv, in such way that it does not show the full "+ ffmpeg -i 'http://...." line as it should.
Edit: It appears the grep is capturing a newline or carriage return, which is fine when you submit it as a one-off command but not fine in the loop. Add tr -d '\r' to the pipeline.
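The stray carriage return can be seen, and removed, in isolation (the URL here is just a placeholder):

```shell
# A line as it might come out of the pipeline when the HTTP data carries
# DOS line endings:
line=$'http://example.com/file?eid=1\r'
clean=$(printf '%s' "$line" | tr -d '\r')
printf '%s\n' "[$clean]"     # -> [http://example.com/file?eid=1]
# Without tr, the trailing \r stays in "$line" and corrupts the URL that
# ffmpeg receives, producing the mangled 400 error seen above.
```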
How to use a greped url provided by tshark inside a bash script?