How can I capture YouTube streams with ffmpeg?
I want to save a YouTube stream with ffmpeg. Can anyone help me?
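For future readers, the usual approach is to resolve the stream's direct media URL with a helper tool and hand that to ffmpeg. The sketch below is hedged: the testsrc2 clip merely stands in for the live stream URL so the recording step is runnable as-is; in practice you would use the URL printed by youtube-dl -g (or yt-dlp) as the input.

```shell
# Resolve the direct stream URL first, e.g.: URL=$(youtube-dl -g "https://...")
# Here a locally generated clip stands in for that URL so the step below runs.
ffmpeg -y -f lavfi -i testsrc2=size=320x240:rate=25 -t 1 live_stand_in.mp4

# Record the stream without re-encoding; stop with q, or limit with -t <duration>.
ffmpeg -y -i live_stand_in.mp4 -c copy capture.ts
```

-c copy avoids a transcode, so the capture keeps the original quality and uses little CPU.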
How to convert a 1080p to 1080i using FFMPEG
I need an FFMPEG command that can convert 1080p to 1080i (both top-field-first and bottom-field-first) for the H264 codec. Please help.
Thanks
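For later readers, one possible approach is the tinterlace filter combined with x264's interlaced-encoding flags. This is a sketch, assuming a progressive source (generated here with testsrc2 so the commands are self-contained); note that interleave_top/interleave_bottom combine pairs of frames into fields, halving the frame rate (e.g. 50p becomes 25 interlaced frames / 50 fields):

```shell
# Stand-in 1080p source
ffmpeg -y -f lavfi -i testsrc2=size=1920x1080:rate=50 -t 1 in1080p.mp4

# Top-field-first 1080i
ffmpeg -y -i in1080p.mp4 -vf tinterlace=interleave_top -c:v libx264 \
       -flags +ildct+ilme -x264opts tff=1 out_tff.mp4

# Bottom-field-first 1080i
ffmpeg -y -i in1080p.mp4 -vf tinterlace=interleave_bottom -c:v libx264 \
       -flags +ildct+ilme -x264opts bff=1 out_bff.mp4
```

-flags +ildct+ilme enables interlaced DCT and motion estimation, and the x264opts key marks the field order in the bitstream.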
ffmpeg -progress parameter not POST-ing any data
I have noticed that the ffmpeg -progress parameter doesn't work with URLs. It only works with files.
For example:
ffmpeg -progress log_file.txt -i http://123.123.123.123:8888/live -c copy -f mpegts out.ts
works and log_file.txt is filled with data.
While
ffmpeg -progress http://127.0.0.1:8888/progress.php -i http://123.123.123.123:8888/live -c copy -f mpegts out.ts
does not post any data to http://127.0.0.1:8888/progress.php script.
It's not a web server configuration issue because:
curl --data "param1=value1&param2=value2" http://127.0.0.1:9790/progress.php
creates a file with the $_POST array content:
Array ( [param1] => value1 [param2] => value2 )
Does -progress really make ffmpeg POST data to URLs?
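For reference, -progress accepts anything ffmpeg's generic I/O layer can write to, but as far as I can tell, for HTTP it opens a single long-lived POST and streams the key=value blocks into it, so a PHP script that only inspects $_POST on a completed request may never see the data. A file, a pipe, or a tcp:// listener is more predictable. A self-contained sketch using a file:

```shell
# Write progress key=value blocks to a file while transcoding a short test clip
ffmpeg -y -f lavfi -i testsrc2=size=320x240:rate=25 -t 1 \
       -progress progress.txt out.mp4

# Each block ends with a progress= line; the final one is progress=end
grep "progress=" progress.txt
```

The same blocks are what would be streamed to an HTTP or tcp:// URL.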
Record video with Accord.net (AForge)
I used Accord.net (AForge) to connect to the webcam and record video, but the stored videos play in slow motion. This is my project:
using AForge.Video;
using AForge.Video.DirectShow;
using AForge.Video.FFMPEG;
using System;
using System.Drawing;
using System.IO;
using System.Threading;
using System.Windows.Forms;

namespace CaptureWebcam
{
    public partial class Form1 : Form
    {
        private VideoCaptureDeviceForm captureDevice;
        private string path = Path.GetDirectoryName(Application.ExecutablePath) + @"\Videos\";
        private FilterInfoCollection videoDevice;
        private VideoCaptureDevice videoSource;
        private VideoFileWriter FileWriter = new VideoFileWriter();
        private Bitmap video;
        bool isRecord = false;

        public Form1()
        {
            InitializeComponent();
        }

        private void videoSource_NewFrame(object sender, NewFrameEventArgs eventArgs)
        {
            video = (Bitmap)eventArgs.Frame.Clone();
            pictureBox1.Image = (Bitmap)eventArgs.Frame.Clone();
        }

        private void btnStartCam_Click(object sender, EventArgs e)
        {
            videoDevice = new FilterInfoCollection(FilterCategory.VideoInputDevice);
            captureDevice = new VideoCaptureDeviceForm();
            if (captureDevice.ShowDialog(this) == DialogResult.OK)
            {
                videoSource = captureDevice.VideoDevice;
                videoSource.NewFrame += new NewFrameEventHandler(videoSource_NewFrame);
                videoSource.Start();
                timer1.Enabled = true;
            }
            //videoSource.DisplayPropertyPage(IntPtr.Zero);
        }

        private Thread workerThread = null;
        private bool stopProcess = false;

        private void recordLiveCam()
        {
            if (!stopProcess)
            {
                while (isRecord)
                {
                    FileWriter.WriteVideoFrame(video);
                }
                FileWriter.Close();
            }
            else
            {
                workerThread.Abort();
            }
        }

        private void btnRecord_Click(object sender, EventArgs e)
        {
            isRecord = true;
            if (!Directory.Exists(path))
            {
                Directory.CreateDirectory(path);
            }
            int h = captureDevice.VideoDevice.VideoResolution.FrameSize.Height;
            int w = captureDevice.VideoDevice.VideoResolution.FrameSize.Width;
            FileWriter.Open(path + "recorded at " + DateTime.Now.ToString("HH-mm-ss") + ".mp4", w, h, 25, VideoCodec.MPEG4);
            stopProcess = false;
            workerThread = new Thread(new ThreadStart(recordLiveCam));
            workerThread.Start();
        }

        private void Form1_Load(object sender, EventArgs e)
        {
        }

        private void btnStopRecord_Click(object sender, EventArgs e)
        {
            stopProcess = true;
            isRecord = false;
            FileWriter.Close();
            workerThread.Abort();
            video = null;
        }

        private void btnStopCam_Click(object sender, EventArgs e)
        {
            try
            {
                if (videoSource.IsRunning)
                {
                    videoSource.SignalToStop();
                    videoSource.Stop();
                    pictureBox1.Image = null;
                    pictureBox1.Invalidate();
                    if (FileWriter.IsOpen)
                    {
                        FileWriter.Close();
                        video = null;
                    }
                }
                else return;
            }
            catch
            {
                videoSource.Stop();
                FileWriter.Close();
                video = null;
            }
        }

        float fts = 0.0f;

        private void timer1_Tick(object sender, EventArgs e)
        {
            fts = videoSource.FramesReceived;
            label1.Text = "Frame Rate : " + fts.ToString("F2") + " fps";
        }
    }
}
And when I click btnStopRecord I get the following error:
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
need help configuring ffmpeg to decode raw AAC with android ndk
I've got an android app that gets raw AAC bytes from an external device and I want to decode that data but I can't seem to get the decoder to work, yet ffmpeg seems to work fine for decoding an mp4 file that contains the same audio data (verified with isoviewer). Recently I was able to get this ffmpeg library on android to decode video frames from the same external device but audio won't seem to work.
Here is the ffmpeg output for the file with the same data:
$ ffmpeg -i Video_2000-01-01_0411.mp4
ffmpeg version 2.6.1 Copyright (c) 2000-2015 the FFmpeg developers built with Apple LLVM version 6.0 (clang-600.0.57) (based on LLVM 3.5svn) configuration: --prefix=/usr/local/Cellar/ffmpeg/2.6.1 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-vda libavutil 54. 20.100 / 54. 20.100 libavcodec 56. 26.100 / 56. 26.100 libavformat 56. 25.101 / 56. 25.101 libavdevice 56. 4.100 / 56. 4.100 libavfilter 5. 11.102 / 5. 11.102 libavresample 2. 1. 0 / 2. 1. 0 libswscale 3. 1.101 / 3. 1.101 libswresample 1. 1.100 / 1. 1.100 libpostproc 53. 3.100 / 53. 3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'AXON_Flex_Video_2000-01-01_0411.mp4': Metadata: major_brand : mp42 minor_version : 1 compatible_brands: isom3gp43gp5 Duration: 00:00:15.73, start: 0.000000, bitrate: 1134 kb/s Stream #0:0(eng): Audio: aac (LC) (mp4a / 0x6134706D), 8000 Hz, mono, fltp, 40 kb/s (default) Metadata: handler_name : soun Stream #0:1(eng): Video: mpeg4 (Simple Profile) (mp4v / 0x7634706D), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 1087 kb/s, 29.32 fps, 26.58 tbr, 90k tbn, 1k tbc (default) Metadata: handler_name : vide
Here is my ndk code for setting up and decoding the audio:
jint ffmpeg_init(JNIEnv *env, jobject this) {
    audioCodec = avcodec_find_decoder(AV_CODEC_ID_AAC);
    if (!audioCodec) {
        LOGE("audio codec %d not found", AV_CODEC_ID_AAC);
        return -1;
    }
    audioContext = avcodec_alloc_context3(audioCodec);
    if (!audioContext) {
        LOGE("Could not allocate codec context");
        return -1;
    }
    int openRet = avcodec_open2(audioContext, audioCodec, NULL);
    if (openRet < 0) {
        LOGE("Could not open codec, error:%d", openRet);
        return -1;
    }
    audioContext->sample_rate = 8000;
    audioContext->channel_layout = AV_CH_LAYOUT_MONO;
    audioContext->profile = FF_PROFILE_AAC_LOW;
    audioContext->bit_rate = 48 * 1024;
    audioContext->sample_fmt = AV_SAMPLE_FMT_FLTP;
    // unsigned char extradata[] = {0x15, 0x88};
    // audioContext->extradata = extradata;
    // audioContext->extradata_size = sizeof(extradata);
    audioFrame = av_frame_alloc();
    if (!audioFrame) {
        LOGE("Could not create audio frame");
        return -1;
    }
}

jint ffmpeg_decodeAudio(JNIEnv *env, jobject this, jbyteArray aacData, jbyteArray output, int offset, int len) {
    LOGI("ffmpeg_decodeAudio()");
    char errbuf[128];
    AVPacket avpkt = {0};
    av_init_packet(&avpkt);
    LOGI("av_init_packet()");
    int error, got_frame;
    uint8_t *buffer = (uint8_t *)(*env)->GetByteArrayElements(env, aacData, 0);
    uint8_t *copy = av_malloc(len);
    memcpy(copy, &buffer[offset], len);
    av_packet_from_data(&avpkt, copy, len);
    if ((error = avcodec_decode_audio4(audioContext, audioFrame, &got_frame, &avpkt)) < 0) {
        ffmpeg_log_error(error);
        av_free_packet(&avpkt);
        return error;
    }
    if (got_frame) {
        LOGE("Copying audioFrame->extended_data to output jbytearray, linesize[0]:%d", audioFrame->linesize[0]);
        (*env)->SetByteArrayRegion(env, output, 0, audioFrame->linesize[0], *audioFrame->extended_data);
    }
    return 0;
}
As you can see, I've got an init function that opens the decoder and creates the context; these all work fine, without error. However, when I call avcodec_decode_audio4 I get an error:
FFMPEG error: -1094995529, Invalid data found when processing input
I've tried all sorts of combinations of AVCodecContext properties. I'm not sure which I need to set for the decoder to do its job, but from reading online I should just need to set the channel layout and the sample_rate (which I've tried by themselves). I've also tried setting the extradata/extradata_size parameters to match the video settings per: http://wiki.multimedia.cx/index.php?title=MPEG-4_Audio but no luck.
Since the device we're getting packets from sends AAC data that has no sound at the beginning (but the packets are valid), I've tried sending just those, since they definitely should decode correctly.
Here is an example of the initial audio packets that are of silence:
010c9eb43f21f90fc87e46fff10a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5dffe214b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4bbd1c429696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696978
Note that the data shown above is just a hex encoding of the data that I'm putting in AVPacket, and it was sent from an external device to the android application. My application doesn't have direct access to the file though so I need to decode the raw frames/samples as I get them. When I look at the audio track data in isoviewer I can see that the audio track's first sample is the same data as what I got from the device that contained that file (thus, the external device is just sending me the sample's raw data). I believe this data can be derived from reading stsz (sample size) box starting at stco (chunk offset) boxes from the mdat box of the file.
Also, isoviewer shows the esds box as having the following:
ESDescriptor{esId=0, streamDependenceFlag=0, URLFlag=0, oCRstreamFlag=0, streamPriority=0, URLLength=0, URLString='null', remoteODFlag=0, dependsOnEsId=0, oCREsId=0, decoderConfigDescriptor=DecoderConfigDescriptor{objectTypeIndication=64, streamType=5, upStream=0, bufferSizeDB=513, maxBitRate=32000, avgBitRate=32000, decoderSpecificInfo=null, audioSpecificInfo=AudioSpecificConfig{configBytes=1588, audioObjectType=2 (AAC LC), samplingFrequencyIndex=11 (8000), samplingFrequency=0, channelConfiguration=1, syncExtensionType=0, frameLengthFlag=0, dependsOnCoreCoder=0, coreCoderDelay=0, extensionFlag=0, layerNr=0, numOfSubFrame=0, layer_length=0, aacSectionDataResilienceFlag=false, aacScalefactorDataResilienceFlag=false, aacSpectralDataResilienceFlag=false, extensionFlag3=0}, configDescriptorDeadBytes=, profileLevelIndicationDescriptors=[[]]}, slConfigDescriptor=SLConfigDescriptor{predefined=2}}
And the binary is this:
00 00 00 30 65 73 64 73 00 00 00 00 03 80 80 80
1f 00 00 00 04 80 80 80 14 40 15 00 02 01 00 00
7d 00 00 00 7d 00 05 80 80 80 02 15 88 06 01 02
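Not an answer to the NDK code itself, but a command-line check that may help narrow things down: samples stored in an MP4 carry no ADTS headers, so a raw frame is only decodable if the decoder has the AudioSpecificConfig out of band (the two extradata bytes, 0x15 0x88 here, which generally must be set before avcodec_open2) or if each frame is wrapped in ADTS. The sketch below builds an AAC-in-MP4 file with ffmpeg's native encoder as a stand-in for the device recording, rewraps the track as self-describing ADTS, and decodes it, confirming the payload format itself is fine:

```shell
# Build a small AAC-in-MP4 file (stands in for the device recording)
ffmpeg -y -f lavfi -i sine=frequency=440:duration=1 -c:a aac in.mp4

# Rewrap the raw samples as ADTS: each frame now carries its own header
ffmpeg -y -i in.mp4 -c:a copy -f adts raw.aac

# ADTS frames decode without any out-of-band extradata
ffmpeg -y -i raw.aac decoded.wav
```

If the raw frames from the device decode once ADTS-wrapped, the problem is almost certainly missing extradata in the codec context rather than corrupt data.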
How to save HLS stream to local disk in source code by ffmpeg?
The HLS stream is divided into multiple ts files; now I want to save these files and the m3u8 file to local disk in source code by calling ffmpeg APIs.
Does ffmpeg support this function?
If so how to implement the function?
Otherwise can anyone give me some suggestions about how to achieve this goal?
Many thanks.
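The command-line equivalent of what those API calls do may clarify the flow: ffmpeg can both write an HLS playlist and read one back, following its segments. A self-contained sketch; the first command creates a local HLS stream to stand in for the remote URL, the second saves the whole stream to a single file without re-encoding:

```shell
# Stand-in for the remote HLS source: write a tiny local playlist + segments
ffmpeg -y -f lavfi -i testsrc2=size=320x240:rate=25 -t 2 \
       -c:v libx264 -f hls -hls_time 1 -hls_list_size 0 stream.m3u8

# Read the playlist (local path or http URL) and save the stream to one file
ffmpeg -y -i stream.m3u8 -c copy saved.ts
```

In source code, the same thing is done by opening the m3u8 with avformat_open_input and remuxing packets; the hls demuxer handles the segment fetching for you.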
How to build and add ffmpeg in a Java project? [on hold]
I am working on video streaming in Java. Now I need to play video in all browsers across different operating systems. I'm having some trouble compiling the FFMPEG library. How do I build the ffmpeg library for a Java application? How do I add it to my project? Please give me an idea.
scaling a video captured in portrait mode using FFMPEG
I'm trying to rotate a video on my android app using FFMPEG and then uploading it to the server. The following command does the job for me:
-y -noautorotate -i inputPath -vf transpose=1,scale=360:640,setsar=1 -metadata:s:v rotate=0
The aspect ratio of the original video is 16:9. I'm trying to maintain the aspect ratio after rotation so I add padding on either side of the video to scale it properly with the following command:
-y -noautorotate -i inputPath -vf transpose=1,scale=iw*min(360/iw\,640/ih):ih*min(360/iw\,640/ih),pad=640:360:(360-iw)/2:(640-ih)/2,setsar=1 -metadata:s:v rotate=0
The video scales perfectly on the website, but now the issue is that I can see the black bars when I play it in the Android app, which I don't want. I don't mind the black bars when I play the video on the website, but in the app it looks terrible. The video container in the app has a height of 280dp and a width of 380dp. How can I scale the video so the black sides don't show when playing it in the app? Any solutions/suggestions?
This is what the video looks like when I play on the website:
This is what it looks like on the app
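One possible direction, sketched with a generated clip so it runs as-is: instead of padding (which is what creates the black bars), scale the video to cover the target frame and crop the overflow. force_original_aspect_ratio=increase makes the scaled video at least 360x640, and crop trims it back, so no bars appear, at the cost of losing some picture at the edges:

```shell
# Stand-in 16:9 landscape source
ffmpeg -y -f lavfi -i testsrc2=size=1280x720:rate=25 -t 1 in.mp4

# Rotate to portrait, scale to cover 360x640, crop the excess (no black bars)
ffmpeg -y -i in.mp4 \
  -vf "transpose=1,scale=360:640:force_original_aspect_ratio=increase,crop=360:640,setsar=1" \
  out.mp4
```

Whether crop-to-fill or pad is right depends on whether losing the edges is acceptable; apps often crop, websites often pad.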
How to capture a layered window with transparency background properly? (using BitBlt)
I want to capture a WPF window (I think a WPF window is a layered window) with a transparent background.
To do that I tried FFmpeg, but:
1 - If I set AllowTransparency (a property of the WPF window) to false, I can capture the window with gdigrab (an ffmpeg device), but the output has a black background. (I don't want a black background.)
2 - If I set AllowTransparency to true, then gdigrab won't work. (I get black frames only.)
I have read David's nice article, he has said:
if you use BitBlt to do this, you could “or in” the CAPTUREBLT flag if you wanted to capture windows that are layered
The gdigrab uses BitBlt, this is gdigrab.c code snippet:
/* Blit screen grab */
if (!BitBlt(dest_hdc, 0, 0,
            clip_rect.right - clip_rect.left,
            clip_rect.bottom - clip_rect.top,
            source_hdc, clip_rect.left, clip_rect.top,
            SRCCOPY | CAPTUREBLT)) {
    WIN32_API_ERROR("Failed to capture image");
    return AVERROR(EIO);
}
You can see the flags (SRCCOPY | CAPTUREBLT).
Please tell me:
1 - Why can't gdigrab capture a WPF window properly?
2 - What changes should be made in this code to do this?
(Sorry for my English, I used translate.google)
Thanks
Mono: FFmpeg on Linux
I am using FFmpeg for my application on Windows. Now I would like it to work on Linux. On Windows I would use:
var process = Process.Start(new ProcessStartInfo
{
    FileName = "ffmpeg",
    Arguments = $"-i {pathOrUrl} -f s16le -ar 48000 -ac 2 pipe:1 -loglevel quiet",
    UseShellExecute = false,
    RedirectStandardOutput = true,
    RedirectStandardError = false,
    CreateNoWindow = true,
});
How would I do the same on Linux? This does not work there.
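Two things differ on Linux that may explain it (a sketch, assuming ffmpeg is installed): with UseShellExecute = false no shell is involved, so FileName must resolve on PATH (or be an absolute path), and the argument string itself is fine. Both can be confirmed from a terminal:

```shell
# 1. Is ffmpeg actually on PATH for the account running the app?
command -v ffmpeg

# 2. Do the equivalent arguments work when run directly? (local test input)
ffmpeg -y -f lavfi -i sine=frequency=440:duration=1 in.wav
ffmpeg -i in.wav -f s16le -ar 48000 -ac 2 -loglevel quiet pipe:1 > pcm.raw
```

If command -v finds nothing, either install ffmpeg via the package manager or set FileName to the full path, e.g. /usr/bin/ffmpeg.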
ffmpeg : mix audio and video of different length
I have 2 files: one video file (without sound) 6 seconds long, and one audio file 10 seconds long. Both contain the same conversation, but the audio starts 4 seconds before the video does.
[----------] audio [------] video
So, I want to mix them together to video file with length 10 seconds where first 4 seconds black screen with audio then goes real video and audio.
[====------] audio+video (where '=' is black screen)
I hope my description was clear enough. How can I do this with ffmpeg or gstreamer?
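One possible ffmpeg approach, sketched with generated stand-ins for the two files: the tpad filter (available in ffmpeg 4.1+) prepends 4 seconds of black to the video stream, and the audio is then mapped alongside it, giving a 10-second output:

```shell
# Stand-ins: 6 s silent video, 10 s audio
ffmpeg -y -f lavfi -i testsrc2=size=320x240:rate=25 -t 6 video.mp4
ffmpeg -y -f lavfi -i sine=frequency=440:duration=10 -c:a aac audio.m4a

# Prepend 4 s of black so the video lines up with the audio's start
ffmpeg -y -i video.mp4 -i audio.m4a \
  -filter_complex "[0:v]tpad=start_duration=4:start_mode=add:color=black[v]" \
  -map "[v]" -map 1:a out.mp4
```

On older ffmpeg builds without tpad, a similar result can be had by concatenating a generated black clip in front of the video before muxing.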
How can I convert videos in php without using ffmpeg?
I wrote PHP code that extracts an mp3 audio track from a video file, using the exec function to run ffmpeg.
Everything works fine on a local WAMP server on Windows. But after deploying the script to the server, I can no longer run ffmpeg because the server does not have the software.
Is there a pure-PHP class, or an API, capable of the same, or at least of basic ffmpeg functionality?
ffmpeg installation undefined reference to
I used this official link to download ffmpeg and compile it. I know it creates static libraries and I have to reference the library folder. When I try to compile this tutorial, I get these errors:
tutorial01.c: In function ‘main':
tutorial01.c:96:5: warning: ‘codec' is deprecated [-Wdeprecated- declarations] if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) { ^
In file included from tutorial01.c:27:0: /home/osboxes/ffmpeg_build/include/libavformat/avformat.h:893:21: note: declared here AVCodecContext *codec; ^
tutorial01.c:104:3: warning: ‘codec' is deprecated [-Wdeprecated-declarations]
pCodecCtxOrig=pFormatCtx->streams[videoStream]->codec;
^
In file included from tutorial01.c:27:0:
/home/osboxes/ffmpeg_build/include/libavformat/avformat.h:893:21: note: declared here AVCodecContext *codec; ^
tutorial01.c:113:3: warning: ‘avcodec_copy_context' is deprecated [-Wdeprecated-declarations] if(avcodec_copy_context(pCodecCtx, pCodecCtxOrig) != 0) { ^
In file included from tutorial01.c:26:0:
/home/osboxes/ffmpeg_build/include/libavcodec/avcodec.h:4240:5: note: declared here
int avcodec_copy_context(AVCodecContext *dest, const AVCodecContext *src); ^
tutorial01.c:131:3: warning: ‘avpicture_get_size' is deprecated [-Wdeprecated-declarations]
 numBytes=avpicture_get_size(AV_PIX_FMT_RGB24, pCodecCtx->width, ^
In file included from tutorial01.c:26:0:
/home/osboxes/ffmpeg_build/include/libavcodec/avcodec.h:5467:5: note: declared here
 int avpicture_get_size(enum AVPixelFormat pix_fmt, int width, int height); ^
tutorial01.c:138:3: warning: ‘avpicture_fill' is deprecated [-Wdeprecated-declarations]
 avpicture_fill((AVPicture *)pFrameRGB, buffer, AV_PIX_FMT_RGB24, ^
In file included from tutorial01.c:26:0:
/home/osboxes/ffmpeg_build/include/libavcodec/avcodec.h:5452:5: note: declared here
 int avpicture_fill(AVPicture *picture, const uint8_t *ptr, ^
tutorial01.c:160:7: warning: ‘avcodec_decode_video2' is deprecated [-Wdeprecated-declarations]
 avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet); ^
In file included from tutorial01.c:26:0:
/home/osboxes/ffmpeg_build/include/libavcodec/avcodec.h:4811:5: note: declared here
 int avcodec_decode_video2(AVCodecContext *avctx, AVFrame *picture, ^
tutorial01.c:177:5: warning: ‘av_free_packet' is deprecated [-Wdeprecated-declarations]
 av_free_packet(&packet); ^
In file included from tutorial01.c:26:0:
/home/osboxes/ffmpeg_build/include/libavcodec/avcodec.h:4472:6: note: declared here
 void av_free_packet(AVPacket *pkt); ^
/tmp/ccHYDd9j.o: In function `main':
tutorial01.c:(.text+0x178): undefined reference to `av_register_all'
tutorial01.c:(.text+0x19f): undefined reference to `avformat_open_input'
tutorial01.c:(.text+0x1c1): undefined reference to `avformat_find_stream_info'
tutorial01.c:(.text+0x1f6): undefined reference to `av_dump_format'
tutorial01.c:(.text+0x2ab): undefined reference to `avcodec_find_decoder'
tutorial01.c:(.text+0x2ea): undefined reference to `avcodec_alloc_context3'
tutorial01.c:(.text+0x304): undefined reference to `avcodec_copy_context'
tutorial01.c:(.text+0x348): undefined reference to `avcodec_open2'
tutorial01.c:(.text+0x35b): undefined reference to `av_frame_alloc'
tutorial01.c:(.text+0x367): undefined reference to `av_frame_alloc'
tutorial01.c:(.text+0x3a4): undefined reference to `avpicture_get_size'
tutorial01.c:(.text+0x3ba): undefined reference to `av_malloc'
tutorial01.c:(.text+0x3ef): undefined reference to `avpicture_fill'
tutorial01.c:(.text+0x439): undefined reference to `sws_getContext'
tutorial01.c:(.text+0x47d): undefined reference to `avcodec_decode_video2'
tutorial01.c:(.text+0x4de): undefined reference to `sws_scale'
tutorial01.c:(.text+0x527): undefined reference to `av_free_packet'
tutorial01.c:(.text+0x53d): undefined reference to `av_read_frame'
tutorial01.c:(.text+0x551): undefined reference to `av_free'
tutorial01.c:(.text+0x560): undefined reference to `av_frame_free'
tutorial01.c:(.text+0x56f): undefined reference to `av_frame_free'
tutorial01.c:(.text+0x57b): undefined reference to `avcodec_close'
tutorial01.c:(.text+0x58a): undefined reference to `avcodec_close'
tutorial01.c:(.text+0x599): undefined reference to `avformat_close_input'
collect2: error: ld returned 1 exit status
I used: sudo gcc -I /home/osboxes/ffmpeg_build/include -L /home/osboxes/ffmpeg_build/lib -lavcode -lavformat -lavutil -lswscale -lz -lm tutorial01.c
my ffmpeg is
ffmpeg version N-82113-g490c6bd Copyright (c) 2000-2016 the FFmpeg developers built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.2) 20160609 configuration: --prefix=/home/osboxes/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/osboxes/ffmpeg_build/include --extra-ldflags=- L/home/osboxes/ffmpeg_build/lib --bindir=/home/osboxes/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libopus --enable-libx264 --enable-nonfree libavutil 55. 33.100 / 55. 33.100 libavcodec 57. 63.103 / 57. 63.103 libavformat 57. 55.100 / 57. 55.100 libavdevice 57. 0.103 / 57. 0.103 libavfilter 6. 64.100 / 6. 64.100 libswscale 4. 1.100 / 4. 1.100 libswresample 2. 2.100 / 2. 2.100 libpostproc 54. 0.100 / 54. 0.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
The tutorial code was updated to work with new ffmpeg versions. I'm using an Ubuntu 16.04 64-bit VM, freshly installed on a Windows 10 host.
So why am I getting all of these errors?
MAINTAINERS: add myself for audiotoolbox
lavf: add AV_DISPOSITION_TIMED_THUMBNAILS
Reviewed-By: Michael Niedermayer
- [DH] doc/APIchanges
- [DH] doc/ffprobe.xsd
- [DH] ffprobe.c
- [DH] libavformat/avformat.h
- [DH] libavformat/version.h
- [DH] tests/ref/fate/concat-demuxer-extended-lavf-mxf
- [DH] tests/ref/fate/concat-demuxer-extended-lavf-mxf_d10
- [DH] tests/ref/fate/concat-demuxer-simple1-lavf-mxf
- [DH] tests/ref/fate/concat-demuxer-simple1-lavf-mxf_d10
- [DH] tests/ref/fate/concat-demuxer-simple2-lavf-ts
- [DH] tests/ref/fate/ffprobe_compact
- [DH] tests/ref/fate/ffprobe_csv
- [DH] tests/ref/fate/ffprobe_default
- [DH] tests/ref/fate/ffprobe_flat
- [DH] tests/ref/fate/ffprobe_ini
- [DH] tests/ref/fate/ffprobe_json
- [DH] tests/ref/fate/ffprobe_xml
vf_colorspace: don't spam console with warnings if range is unspecified.
How to install ffmpeg for PHP
I've successfully installed ffmpeg using ssh, as root, on my dedicated server (CentOS 7).
ffmpeg works fine - but now I need to use it without root access.
When I try to use ffmpeg without root access, I get the following error:
ffmpeg: error while loading shared libraries: libx264.so.148: cannot open shared object file: No such file or directory
The final goal is to be able to use ffmpeg inside my PHP scripts, which do not have root access.
Convert video from FFMPEG than video is rotated [duplicate]
This question already has an answer here:
I have created an application API. In this application, when I upload a video from the file manager the video comes out rotated, but when I capture video and upload it directly it works perfectly.
I have used the transpose=1 option, but then the captured video is rotated 90 degrees after upload, and if I remove it, the file-manager video is rotated instead.
Please give me a solution.
/usr/bin/ffmpeg -i video.mp4 -i watermark.png -filter_complex 'overlay=10:10' -s 640x1280 -b 512k -vcodec mpeg1video -acodec copy -vcodec h264 -acodec aac -strict -2 output_video.mp4
A video uploaded from the file manager is auto-rotated by 270 degrees even though I don't use any rotation command; captured video works perfectly.
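A guess at what's happening, since phone cameras often record sensor-orientation video plus a rotation tag that players honor but filters ignore: the file-manager video carries such a tag (270 degrees here) while the captured one doesn't, so one fixed transpose=1 cannot be right for both. One approach is to probe the tag first and only transpose when it is set (note the transpose direction depends on the tag value); recent ffmpeg versions also auto-rotate on re-encode unless -noautorotate is given. A runnable sketch of the probe-then-transpose idea, using a generated clip (which has no rotation tag):

```shell
# Stand-in input (no rotation tag; a phone upload might report 90 or 270 here)
ffmpeg -y -f lavfi -i testsrc2=size=640x360:rate=25 -t 1 in.mp4

# Read the rotation tag, defaulting to 0 when absent
rot=$(ffprobe -v error -select_streams v:0 \
      -show_entries stream_tags=rotate -of csv=p=0 in.mp4)
rot=${rot:-0}

# Only rotate when the metadata says so (direction shown is one possibility)
if [ "$rot" = "270" ]; then
    ffmpeg -y -i in.mp4 -vf "transpose=1" -metadata:s:v rotate=0 out.mp4
else
    ffmpeg -y -i in.mp4 -c copy out.mp4
fi
```

Because the test clip has no tag, the else branch runs and the video is copied unrotated; a tagged phone upload would take the transpose branch instead.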
SDL2 C++ Capturing Video of Renderer Animation/Sprite
I have an animation/sprite created using SDL2. The animation works fine when it is being rendered to a screen. But now I also want it to be recorded into a video file (locally stored). For this, I am planning on using FFmpeg APIs, to which I'll be sending a raw RGB pixel data array.
My problem is with fetching the data from SDL2 APIs.
What I've tried is:
// From http://stackoverflow.com/questions/30157164/sdl-saving-window-as-bmp
SDL_Surface *sshot = SDL_CreateRGBSurface(0, 750, 750, 32, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000);
SDL_RenderReadPixels(gRenderer, NULL, SDL_PIXELFORMAT_ARGB8888, sshot->pixels, sshot->pitch); // From https://wiki.libsdl.org/SDL_RWFromMem
char fName[50];
sprintf(fName, "/tmp/a/ss%03d.bmp", fileCnt);
char bitmap[310000];
SDL_RWops *rw = SDL_RWFromMem(bitmap, sizeof(bitmap));
SDL_SaveBMP_RW(sshot, rw, 1);
The above does not work, but dumping a single frame into a file with the following code works:
SDL_SaveBMP(sshot, "/tmp/alok1/ss.bmp")
This obviously is not an acceptable solution: writing thousands of BMPs and then using FFmpeg from the command line to create a video.
What am I doing wrong? How do you extract data from SDL_RWops? Is the use of SDL_RWFromMem the right approach to my problem statement?
How can I reset the path of FFMPEG in Java?
I accidentally set the path for FFMPEG to a different folder, and I can't change it back.
I'm using Processing (the API and its IDE) and the user-created Video Export library to capture and write an mp4 video file. The library requires FFMPEG, so I downloaded and installed it. After installing, I ran the code, and the library asked (via a Java prompt) for the path of FFMPEG. I mistakenly set the path to a different folder, which I eventually deleted. Knowing I had set the wrong folder, I ran the code again to see if I could trigger the Java prompt again and reset the path.
The console response I received was (I'm sure most of it doesn't relate to the actual issue, but I wanted to show all of it in case it somehow does):
Oct 24, 2016 10:23:25 PM java.util.prefs.WindowsPreferences
WARNING: Could not open/create prefs root node Software\JavaSoft\Prefs at root 0x80000002. Windows RegCreateKeyEx(...) returned error code 5.
java.io.IOException: Cannot run program "C:...\Processing\Octree_Graphics\data\FFMPEG\ff-prompt.bat": CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at com.hamoid.VideoExport.startFfmpeg(Unknown Source)
    at com.hamoid.VideoExport.initialize(Unknown Source)
    at com.hamoid.VideoExport.saveFrame(Unknown Source)
    at Octree_Graphics.draw(Octree_Graphics.java:90)
    at processing.core.PApplet.handleDraw(PApplet.java:2399)
    at processing.opengl.PSurfaceJOGL$DrawListener.display(PSurfaceJOGL.java:731)
    at jogamp.opengl.GLDrawableHelper.displayImpl(GLDrawableHelper.java:692)
    at jogamp.opengl.GLDrawableHelper.display(GLDrawableHelper.java:674)
    at jogamp.opengl.GLAutoDrawableBase$2.run(GLAutoDrawableBase.java:443)
    at jogamp.opengl.GLDrawableHelper.invokeGLImpl(GLDrawableHelper.java:1293)
    at jogamp.opengl.GLDrawableHelper.invokeGL(GLDrawableHelper.java:1147)
    at com.jogamp.newt.opengl.GLWindow.display(GLWindow.java:759)
    at com.jogamp.opengl.util.AWTAnimatorImpl.display(AWTAnimatorImpl.java:81)
    at com.jogamp.opengl.util.AnimatorBase.display(AnimatorBase.java:452)
    at com.jogamp.opengl.util.FPSAnimator$MainTask.run(FPSAnimator.java:178)
    at java.util.TimerThread.mainLoop(Timer.java:555)
    at java.util.TimerThread.run(Timer.java:505)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessImpl.create(Native Method)
    at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
    at java.lang.ProcessImpl.start(ProcessImpl.java:137)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
    ... 17 more
VideoExport error: Ffmpeg failed. Study C:...\Processing\Octree_Graphics\basic.mp4.txt for more details.
I need to find a way to reset the path of FFMPEG that Java has stored. I have tried uninstalling and reinstalling the library, but I haven't touched Java itself (I don't want to mess with anything in its current state).
P.S. The file listed at the end of the console response (basic.mp4.txt) was empty because it had not begun writing to the file yet.