libmoldeo (Moldeo 1.0 Core)  1.0
libmoldeo is the set of objects and functions that perform the basic operations of the Moldeo platform and make up its core.
moGsGraph.cpp
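moGsGraph wraps GStreamer pipelines that decode a media source (file, webcam, RTSP or HTTP stream) and hand raw frames to Moldeo's bucket pool through pad probes or an appsink. The sketch below is illustrative only (it is not part of libmoldeo): it shows a stand-alone pipeline roughly equivalent to the ones this class assembles, a decodebin feeding videoconvert and an appsink, using standard GStreamer 1.x calls. The URI and the element/variable names in it are placeholders.

/* Illustrative sketch only: a stand-alone pipeline comparable to the ones
 * moGsGraph builds (decodebin -> videoconvert -> appsink). */
#include <gst/gst.h>

static GstElement *build_preview_pipeline (const char *uri)
{
  gst_init (NULL, NULL);
  /* uridecodebin picks a demuxer/decoder, videoconvert normalizes the format,
   * appsink exposes the decoded buffers to application code. */
  gchar *desc = g_strdup_printf (
      "uridecodebin uri=%s ! videoconvert ! video/x-raw,format=RGBA ! "
      "appsink name=sink emit-signals=true", uri);
  GstElement *pipeline = gst_parse_launch (desc, NULL);
  g_free (desc);
  return pipeline;
}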
1 /*******************************************************************************
2 
3  moGsGraph.cpp
4 
5  ****************************************************************************
6  * *
7  * This source is free software; you can redistribute it and/or modify *
8  * it under the terms of the GNU General Public License as published by *
9  * the Free Software Foundation; either version 2 of the License, or *
10  * (at your option) any later version. *
11  * *
12  * This code is distributed in the hope that it will be useful, but *
13  * WITHOUT ANY WARRANTY; without even the implied warranty of *
14  * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU *
15  * General Public License for more details. *
16  * *
17  * A copy of the GNU General Public License is available on the World *
18  * Wide Web at <http://www.gnu.org/copyleft/gpl.html>. You can also *
19  * obtain it by writing to the Free Software Foundation, *
20  * Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA. *
21  * *
22  ****************************************************************************
23 
24  Copyright(C) 2006 Fabricio Costa
25 
26  Authors:
27  Fabricio Costa
28 
29  Gstreamer list of defined types:
30  http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html
31 
32 *******************************************************************************/
33 #include "moGsGraph.h"
34 
35 #include <gst/gst.h>
36 
37 #ifndef GSTVERSION
38 #include <gst/interfaces/propertyprobe.h>
39 #else
40 #endif // GSTVERSION
41 #define GSTVERSION
42 #include "moFileManager.h"
43 
45 #ifndef MO_GSTREAMER
46 #define MO_GSTREAMER
47 #endif
48 
49 #ifdef MO_GSTREAMER
50 
51 #ifdef GSTVERSION
52  #include <gst/app/gstappsink.h>
53  #define DECODEBIN "decodebin"
54  #define VIDEOCONVERT "videoconvert"
55 #else
56  #define VIDEOCONVERT "ffmpegcolorspace"
57  #ifdef MO_MACOSX
58  #define DECODEBIN "decodebin"
59  #else
60  #define DECODEBIN "decodebin2"
61  #endif
62 #endif
63 
64 #define USING_SYNC_FRAMEBUFFER
65 
66 static gboolean bus_call ( GstBus *bus, GstMessage *msg, void* user_data)
67 {
68  //cout << "bus_call: new message" << endl;
69  bus = NULL;
70  moGsGraph* pGsGraph = (moGsGraph*) user_data;
71 
72  if (true) {
73  const GstStructure *s;
74 
75  s = gst_message_get_structure ((GstMessage *)msg);
76 
77  /*
78  pGsGraph->MODebug2->Message(
79  moText("moGsGraph:: Got message from element \"")
80  + moText( GST_STR_NULL (GST_ELEMENT_NAME (GST_MESSAGE_SRC (msg))) )
81  + moText("\" (")
82  + moText(gst_message_type_get_name (GST_MESSAGE_TYPE (msg)))
83  + moText(")") );
84 */
85  if (s) {
86  gchar *sstr;
87 
88  sstr = gst_structure_to_string (s);
89  //pGsGraph->MODebug2->Message( moText(sstr) );
90  //g_print ("%s\n", sstr);
91  g_free (sstr);
92  } else {
93 
94  //pGsGraph->MODebug2->Message( moText(" <no message details>") );
95  //g_print ("no message details\n");
96  }
97  }
98 
99  switch (GST_MESSAGE_TYPE (msg))
100  {
101  case GST_MESSAGE_EOS:
102  {
103  //g_message ("End-of-stream");
104  pGsGraph->MODebug2->Message(moText("moGsGraph:: EOS <End-of-stream> "));
105  pGsGraph->SetEOS(true);
106  //g_main_loop_quit (loop);
107  break;
108  }
109 
110  case GST_MESSAGE_ERROR:
111  {
112  gchar *debug;
113 
114  GError *err;
115 
116  gst_message_parse_error ((GstMessage *)msg, &err, &debug);
117  pGsGraph->MODebug2->Error(moText("moGsGraph:: gst message error: ") + moText(debug));
118  g_free (debug);
119  //g_error ("%s", err->message);
120  //g_error_free (err);
121  //g_main_loop_quit (loop);
122 
123  break;
124  }
125 
126  default:
127 
128  break;
129 
130 }
131 
132  return true;
133 }
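bus_call above is the GstBus watch callback that this class installs on the pipeline bus (see the gst_bus_add_watch call further down, in the graph initialization code). A minimal sketch of that wiring, with a hypothetical graph pointer as user data:

/* Sketch: installing bus_call as the pipeline's bus watch. */
GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
guint watch_id = gst_bus_add_watch (bus, bus_call, graph /* user_data: a moGsGraph* */);
gst_object_unref (bus);   /* the watch holds its own reference */
/* ...and later, as the teardown code further down does: g_source_remove (watch_id); */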
134 
135 
136 /*
138 typedef struct {
139  GstMiniObject mini_object;
140 
141  // pointer to data and its size
142  guint8 *data;
143  guint size;
144 
145  // timestamp
146  GstClockTime timestamp;
147  GstClockTime duration;
148 
149  // the media type of this buffer
150  GstCaps *caps;
151 
152  // media specific offset
153  guint64 offset;
154  guint64 offset_end;
155 
156  guint8 *malloc_data;
157 } GstBuffer;
158 */
159 
160 
163  moGPointer u_data
164  ) {
165 
166  moGsGraph* pGsGraph;
167 
168  if (u_data!=0) {
169  pGsGraph = (moGsGraph*)u_data;
170  if (pGsGraph) {
171  pGsGraph->MODebug2->Error(moText("moGsGraph::cb_buffer_disconnected !!!"));
172  }
173  }
174 
175  //moAbstract::MODebug2->Error(moText("moGsGraph::cb_buffer_disconnected !!!"));
176  return false;
177 }
178 
206 #ifdef GSTVERSION
208 moGsGraph::appsink_new_sample( moGstAppSink* appsink, moGPointer user_data ) {
209 
210  GstSample* sample;
211  GstMapInfo mapinfo;
212  GstAppSink* psink = (GstAppSink*) appsink;
213  if (!psink) return GST_FLOW_ERROR;
214 
 215  moGsGraph* pGsGraph = (moGsGraph*) user_data;
 216  if (!pGsGraph) return GST_FLOW_ERROR;
 217 
 218  int w = pGsGraph->GetVideoFormat().m_Width;
 219  int h = pGsGraph->GetVideoFormat().m_Height;
220  //pGsGraph->MODebug2->Message("new sample");
221  moBucket *pbucket=NULL;
222 
223 //pGsGraph->MODebug2->Message(moText("new-sample") );
224 
225  if (1==1) {
226  //GstSample* sample = gst_app_sink_try_pull_sample ( psink, 1000000 );
 227  sample = gst_app_sink_pull_sample ( psink );
228  //g_signal_emit_by_name (psink, "pull-sample", &sample);
229  //GstSample* sample = gst_base_sink_get_last_sample(GST_BASE_SINK(psink));
230  if (!sample) {
231  pGsGraph->MODebug2->Message(moText("no sample") );
232  return GST_FLOW_ERROR;
233  } else {
234  //pGsGraph->MODebug2->Message(moText("sample!") );
235  }
236 
237  GstCaps* bcaps = gst_sample_get_caps( sample );
238  if (!bcaps) return GST_FLOW_OK;
239 
240  GstBuffer* Gbuffer = gst_sample_get_buffer (sample);
241  int bsize = gst_buffer_get_size( Gbuffer );
242  if (!( bsize>0 && (int)bsize<=(h*w*4) )) return GST_FLOW_ERROR;
243  //pGsGraph->MODebug2->Message(moText("Bucket receiving size: ") + IntToStr(bsize) );
244 
245  if (1==1) {
246 
247  //gst buffer to moldeo bucketpool
248 
249  if (!pGsGraph->m_pBucketsPool) return GST_FLOW_ERROR;
250  if(pGsGraph->m_pBucketsPool->IsFull()) {
251  //pGsGraph->MODebug2->Warning("appsink_new_sample > bckt full");
252  gst_sample_unref(sample);
253  return GST_FLOW_OK;
254  }
255 
256  pbucket = new moBucket();
257  if (pbucket==NULL) return GST_FLOW_ERROR;
258  }
259 
260  if (Gbuffer) {
261  gst_buffer_map ( Gbuffer, &mapinfo, GST_MAP_READ);
262  //GstBuffer* Gbuffer2 = gst_buffer_ref (Gbuffer);
263  //if (Gbuffer2) {
264  //gst_buffer_map ( Gbuffer2, &mapinfo, GST_MAP_READ);
265 
266  //MOubyte color = mapinfo.data[0];
267  //pGsGraph->MODebug2->Message(moText("color: ") + IntToStr(color) );
268  if (bsize) {
269  //pGsGraph->MODebug2->Message(moText("copying: ") + IntToStr(bsize) );
270  //pGsGraph->m_Buckets[0].Copy( bsize, (MOubyte*)mapinfo.data );
271  pbucket->SetBuffer( bsize,(MOubyte*)mapinfo.data );
272  //pbucket->BuildBucket(bsize,128);
273  } else {
274  pGsGraph->MODebug2->Error(moText("m_Buckets size: ") + IntToStr(pGsGraph->m_Buckets[0].GetSize()) + moText(" do not match with buffer size: ") + IntToStr(bsize) );
275  }
276  //gst_buffer_unmap ( Gbuffer2, &mapinfo );
277  //gst_buffer_unref(Gbuffer2);
278 
279  //}
280  gst_buffer_unmap ( Gbuffer, &mapinfo );
281  }
282 
283  if (1==1) {
284 
285 
286  bool added_bucket = pGsGraph->m_pBucketsPool->AddBucket( pbucket );
287  if(!added_bucket)
288  pGsGraph->MODebug2->Error(moText("appsink_new_sample > Bucket error"));
289  //else pGsGraph->MODebug2->Message("bckt added!!"+IntToStr(pGsGraph->m_pBucketsPool->m_nBuckets) );
290  }
291 
292 
293  gst_sample_unref(sample);
294  }
295  return GST_FLOW_OK;
296 }
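The three appsink handlers above (appsink_new_sample, appsink_new_preroll, appsink_eos) follow the usual GstAppSink callback pattern. A hypothetical registration, assuming an appsink element created elsewhere in the graph-building code is available as appsink, could look like this (the actual wiring is not shown in this listing):

/* Sketch (assumed wiring): routing appsink signals to the static handlers above. */
g_object_set (G_OBJECT (appsink), "emit-signals", TRUE, "sync", TRUE, NULL);
g_signal_connect (appsink, "new-sample",  G_CALLBACK (moGsGraph::appsink_new_sample),  this);
g_signal_connect (appsink, "new-preroll", G_CALLBACK (moGsGraph::appsink_new_preroll), this);
g_signal_connect (appsink, "eos",         G_CALLBACK (moGsGraph::appsink_eos),         this);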
297 
299 moGsGraph::appsink_new_preroll( moGstAppSink* appsink, moGPointer user_data ) {
300 
301  return 0;
302 }
303 
304 void
305 moGsGraph::appsink_eos( moGstAppSink* appsink, moGPointer user_data ) {
306 
307 }
308 #endif
309 
310 
311 
312 
313 #ifndef GSTVERSION
316 #else
319 #endif
320 {
321  moGsGraph* pGsGraph;
322  pad = NULL;
323  GstStructure* str = NULL;
324  GstBuffer* Gbuffer;
325  GstCaps* caps = NULL;
326  GstPad* Gpad = NULL;
327 
328 #ifndef GSTVERSION
329  Gbuffer = (GstBuffer*)buffer;
330  caps = Gbuffer->caps;
331 #else
332  GstPadProbeInfo* Ginfo = (GstPadProbeInfo*) info;
333  Gbuffer = GST_PAD_PROBE_INFO_BUFFER ( Ginfo );
334  Gpad = (GstPad*)pad;
335  if (Gpad)
336  caps = gst_pad_get_current_caps( Gpad );
337 #endif
338 
339  if (caps)
340  str = gst_caps_get_structure ( (caps), 0);
341  else
342  return false;
343 
344  if (str==NULL)
345  return FALSE;
346 
347 
348  const gchar *sstr;
349  const gchar *strname;
350 
351  strname = gst_structure_get_name( str );
352  sstr = gst_structure_to_string (str);
353 
354  //cout << "new data: timestamp: " << buffer->timestamp << " duration:" << buffer->duration << " size:" << buffer->size << " caps:" << sstr << endl;
355  //moAbstract::MODebug2->Message( moText(" moGsGraph:: cb_have_data") );
356 
357  gchar* isaudio = NULL;
358  gchar* isvideo = NULL;
359 
360  isaudio = g_strrstr (strname, "audio");
361  isvideo = g_strrstr (strname, "video");
362 
363  if (u_data!=0) {
364  pGsGraph = (moGsGraph*)u_data;
365 
366  if (isvideo) {
367  if (pGsGraph->m_VideoFormat.m_WaitForFormat)
368  pGsGraph->SetVideoFormat( caps, Gbuffer );
369  }
370 
371  if (isaudio) {
372  if (pGsGraph->m_AudioFormat.m_WaitForFormat)
373  pGsGraph->SetAudioFormat( caps, Gbuffer );
374  }
375  } else {
376  //moAbstract::MODebug2->Error( moText(" moGsGraph:: cb_have_data error: no user data!!") );
 377  return true; // keep trying
378  }
379 
380  //pGsGraph->MODebug2->Message(moText("moGsGraph::cb_have_data receiving..."));
381 
382  //return true;
383 
384  int w = pGsGraph->GetVideoFormat().m_Width;
385  int h = pGsGraph->GetVideoFormat().m_Height;
386 
387  //cout << "w:" << w << "h:" << h << endl;
388 
389  if (Gbuffer ) {
390  int bsize;
391 #ifndef GSTVERSION
392  bsize = Gbuffer->size;
393 #else
394  bsize = gst_buffer_get_size( Gbuffer );
395 #endif
396  if (isvideo) {
397  if ( bsize>0 && (int)bsize<=(h*w*4) ) {
 398  // passing buffer to bucketpool
399  moBucket *pbucket=NULL;
400 
401  if (pGsGraph->m_pBucketsPool)
402  if(!pGsGraph->m_pBucketsPool->IsFull()) {
403 
404  //pGsGraph->MODebug2->Message(moText("Bucket receiving size: ") + IntToStr(Gbuffer->size) );
405 
406  pbucket = new moBucket();
407  if(pbucket!=NULL) {
408 
409  //pGsGraph->m_VideoFormat.m_BufferSize = Gbuffer->size;
410  //pGsGraph->m_VideoFormat.m_TimePerFrame = Gbuffer->duration;
411 
412  gint value_numerator, value_denominator;
413  gst_structure_get_fraction( str, "framerate", &value_numerator, &value_denominator );
414 
415  //MOuint frate = (value_numerator * 100) / value_denominator;
416  //MODebug2->Push( " frate: "+ IntToStr(frate) + " timeperframe: " + IntToStr(Gbuffer->duration));
417 #ifndef GSTVERSION
418  pbucket->SetBuffer( bsize,(MOubyte*)Gbuffer->data );
419 #else
420  pbucket->SetBuffer( bsize,(MOubyte*)GST_PAD_PROBE_INFO_DATA(Ginfo) );
421 #endif
422  //pbucket->BuildBucket( w*h*4, 100 );
423  //MODebug2->Push("bucket created.");
424  //gst_buffer_extract( Gbuffer, 0, pbucket->GetBuffer(), Gbuffer->size );
425 
426 
427  if(!pGsGraph->m_pBucketsPool->AddBucket( pbucket )) {
428  pGsGraph->MODebug2->Error(moText("Bucket error"));
429  }// else pGsGraph->MODebug2->Message("bucket Added.");
430  else {
431  //cout << "bckt added!!" << bsize << " #" << pGsGraph->m_pBucketsPool->m_nBuckets << endl;
432  pGsGraph->MODebug2->Message("bucket Added.");
433 
434  }
435  }
436 
437  }
438 
439  } else {
440  pGsGraph->MODebug2->Error( moText(" moGsGraph:: cb_have_data error: wrong buffer size:")
441  + IntToStr(bsize));
442 
443  }
444  }
445  } else {
446  pGsGraph->MODebug2->Error( moText(" moGsGraph:: cb_have_data error: no Gbuffer data!!") );
447  }
448 
449  gst_object_unref( caps );
450 
451  return TRUE;
452 }
453 
454 
455 #ifndef GSTVERSION
456 void
458 
459  rtspsrc = NULL;
460  GstCaps *caps = NULL;
461  GstPadLinkReturn padlink;
462  gchar* padname = NULL;
463  const gchar* strname = NULL;
464  const gchar* medianame = NULL;
465  GstStructure *str = NULL;
466  GstPad* Gpad = (GstPad*) pad;
467  moGsGraph* pGsGraph;
468 
469  if (gst_pad_is_linked(Gpad)) {
470  return;
471  }
472  if (u_data!=0) {
473  pGsGraph = (moGsGraph*)u_data;
474 
475  /* check media type */
476  caps = gst_pad_get_caps (Gpad);
477  padname = gst_pad_get_name(Gpad);
478  if (padname) {
479  str = gst_caps_get_structure (caps, 0);
480 
481  const gchar *sstr;
482 
483  sstr = gst_structure_to_string (str);
484  strname = gst_structure_get_name (str);
485  medianame = gst_structure_get_string (str, "media");
486  //strname = GST_STRUCTURE(str)->has_field("media");
487 
488  moText dbgstr = medianame;
489  pGsGraph->MODebug2->Push( dbgstr );
490 
491  if (g_strrstr (medianame, "video")) {
493  if ( pGsGraph->m_pRTSPDepaySink ) {
494  padlink = gst_pad_link ( Gpad, (GstPad*)pGsGraph->m_pRTSPDepaySink);
495  if (padlink==GST_PAD_LINK_OK) {
497  }
498  } else
499  if ( pGsGraph->m_pHTTPSource ) {
500  padlink = gst_pad_link ( Gpad, (GstPad*)pGsGraph->m_pDecoderBin );
501  if (padlink==GST_PAD_LINK_OK) {
503  }
504  }
505  }
506 
507  }
508  }
509 
510 }
511 
512 #else
513 
514 #endif
515 
516 
517 #ifndef GSTVERSION
518 void
519 moGsGraph::cb_newpad ( moGstElement *decodebin, moGstPad *pad, moGBoolean last, moGPointer u_data)
520 #else
521 void
522 moGsGraph::cb_pad_added_new ( moGstElement *decodebin, moGstPad *pad, moGPointer u_data)
523 #endif
524 {
525  decodebin = NULL;
526 #ifndef GSTVERSION
527  last = false;
528 #endif
529  GstCaps *caps = NULL;
530  GstPad *videopad = NULL;
531 // GstPad *audiopad = NULL;
532  GstPad *audiopadinconverter = NULL;
533  GstPadLinkReturn padlink;
534  gchar* padname = NULL;
535  const gchar* strname = NULL;
536  GstStructure *str = NULL;
537  GstPad* Gpad = (GstPad*) pad;
538 
539  moGsGraph* pGsGraph = NULL;
540  GstElement* SinkElement = NULL;
541 
542  cout << "cb_pad_added_new" << endl;
543 
544 
545  if (gst_pad_is_linked(Gpad)) {
546  cout << "cb_pad_added_new already linked!" << endl;
547  return;
548  }
549 
550 
551  if (u_data!=0) {
552  pGsGraph = (moGsGraph*)u_data;
553  /* check media type */
554 #ifndef GSTVERSION
555  caps = gst_pad_get_caps (Gpad);
556 #else
557  caps = gst_pad_get_current_caps(Gpad);
558 #endif
559  padname = gst_pad_get_name(Gpad);
560  if (padname) {
561  str = gst_caps_get_structure (caps, 0);
562 
563  const gchar *sstr=NULL;
564  if (str) {
565  sstr = gst_structure_to_string (str);
566  cout << "cb_pad_added_new: new pad: " << padname << "caps:" << sstr << endl;
567  } else {
568  MODebug2->Error(moText("moGsGraph::cb_pad_added_new > gst_caps_get_structure is empty") );
569  }
570 
571  if (sstr==NULL) {
572  MODebug2->Error(moText("moGsGraph::cb_pad_added_new > sstr gst_structure_to_string is empty") );
573  } else strname = gst_structure_get_name (str);
574  cout << "cb_newpad: new pad: " << padname << "strname:" << strname << endl;
575 
576  bool is_rtsp = false;
577  if (g_strrstr (strname, "application/x-rtp")) {
578  is_rtsp = true;
579  strname = gst_structure_get_string(str,"media");
580  cout << "application/x-rtp: media: " << strname << endl;
581  }
582 
583 
584  bool forcing_video = false;
585  bool is_video = false;
586  bool is_audio = false;
587  if (strname==NULL) {
588  //cout << "cb_newpad: strname==NULL" << endl;
589  MODebug2->Error(moText("moGsGraph::cb_pad_added_new > gst_structure_to_string is empty, forcing video!") );
590  //return;
591  forcing_video = true;
592  } else {
593  is_video = g_strrstr (strname, "video");
594  is_audio = g_strrstr (strname, "audio");
595  }
596 
597 
598 
599 
600  if (is_audio) {
601  pGsGraph->m_pAudioPad = Gpad;
602 
603  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: audio pad created > building filters"));
604 
607 
608  if (pGsGraph->m_pAudioConverter) {
609 #ifndef GSTVERSION
610  audiopadinconverter = gst_element_get_pad ( (GstElement*) pGsGraph->m_pAudioConverter, "sink");
611 #else
612  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: get static pad sink audio converter"));
613 audiopadinconverter = gst_element_get_static_pad ( (GstElement*) pGsGraph->m_pAudioConverter, "sink");
614 #endif
615  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: audio pad link"));
616  padlink = gst_pad_link (Gpad, audiopadinconverter);
617 
618  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: srcAudio"));
619  GstPad* srcAudio = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pAudioConverter, "src");
620 
621  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: pad link ok?"));
622  if (padlink==GST_PAD_LINK_OK) {
623  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: pad link is ok GST_PAD_LINK_OK"));
624 #ifndef GSTVERSION
625  pGsGraph->cb_have_data_handler_id = gst_pad_add_buffer_probe_full ( srcAudio, G_CALLBACK (cb_have_data), pGsGraph, (GDestroyNotify) (cb_buffer_disconnected) );
626 #else
627  /*pGsGraph->cb_have_data_handler_id = gst_pad_add_probe ( srcAudio,
628  GST_PAD_PROBE_TYPE_BUFFER,
629  (GstPadProbeCallback) cb_have_data,
630  pGsGraph,
631  (GDestroyNotify) (cb_buffer_disconnected) );*/
632 #endif
633  }
634 
635  } else if (pGsGraph->m_pAudioSink) {
636  audiopadinconverter = gst_element_get_static_pad ( (GstElement*) pGsGraph->m_pAudioSink, "sink");
637  padlink = gst_pad_link (Gpad, audiopadinconverter);
638  }
639 
640 
641  } else if (is_video || forcing_video ) {
642  pGsGraph->m_pVideoPad = Gpad;
643 
644  MODebug2->Message(moText("moGsGraph::cb_pad_added_new: video pad created"));
645  if (is_rtsp) {
646  videopad = (GstPad*)pGsGraph->m_pRTSPDepaySink;
647  if (videopad) {
648  padlink = gst_pad_link( Gpad, videopad );
649  }
650  if (padlink==GST_PAD_LINK_OK) {
651  cout << "moGsGraph::cb_pad_added_new: linked rtsp source with rtsp depay" << endl;
652  } else {
653  cout << "moGsGraph::cb_pad_added_new: ERROR: UNlinked rtsp source with rtsp depay" << endl;
654  }
655  } else
656  if (pGsGraph->m_pVideoScale==NULL) {
 657  // direct version to videoscale
658  if (!(GstElement*)pGsGraph->m_pColorSpaceInterlace) {
659  SinkElement = (GstElement*)pGsGraph->m_pColorSpace;
660  } else {
661  SinkElement = (GstElement*)pGsGraph->m_pColorSpaceInterlace;
662  }
663 #ifndef GSTVERSION
664  videopad = gst_element_get_pad ( SinkElement, "sink");
665  if (videopad) {
666  padlink = gst_pad_link( Gpad, videopad );
667  }
668 #else
669  videopad = gst_element_get_static_pad( SinkElement, "sink");
670  if (videopad) {
671  padlink = gst_pad_link( Gpad, videopad );
672  }
673 #endif
 674  // version with deinterlace
675  //videopad = gst_element_get_pad ( (GstElement*)pGsGraph->m_pVideoDeinterlace, "sink");
676 
677  //bool res = gst_pad_set_caps( gst_element_get_pad ( pGsGraph->m_pColorSpace, "src"), gst_caps_new_simple ("video/x-raw-rgb","bpp", G_TYPE_INT, 24, NULL) );
678 
679  if (padlink==GST_PAD_LINK_OK) {
680 // caps = gst_pad_get_caps( Gpad );
681  //pGsGraph->SetVideoFormat(caps);
682 #ifndef GSTVERSION
683  GstPad* srcRGB = gst_element_get_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
684  pGsGraph->cb_have_data_handler_id = gst_pad_add_buffer_probe_full ( srcRGB, G_CALLBACK (cb_have_data), pGsGraph, (GDestroyNotify) (cb_buffer_disconnected) );
685 #else
686 
687  MODebug2->Message(moText("moGsGraph::cb_pad_added_new > padlink success, rock and rolling live video.") );
688 
689  GstPad* srcRGB = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pFakeSink, "sink");
690  /*
691  pGsGraph->cb_have_data_handler_id = gst_pad_add_probe ( srcRGB,
692  GST_PAD_PROBE_TYPE_BUFFER,
693  (GstPadProbeCallback) cb_have_data,
694  pGsGraph,
695  (GDestroyNotify) (cb_buffer_disconnected) );
696  */
697 #endif
698  //cout << "cb_newpad: linked pads..." << endl;
699  } else MODebug2->Error(moText("moGsGraph::cb_pad_added_new > padlink BAD!") );
700 
701  } else {
 702  // version 2, with videoscale
 703 
 704  // direct version to videoscale
705 #ifndef GSTVERSION
706 videopad = gst_element_get_pad ( (GstElement*)pGsGraph->m_pVideoScale, "sink");
707 #else
708 videopad = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pVideoScale, "sink");
709 #endif
 710  // version with deinterlace
711  //videopad = gst_element_get_pad ( (GstElement*)pGsGraph->m_pVideoDeinterlace, "sink");
712  //bool res = gst_pad_set_caps( gst_element_get_pad ( pGsGraph->m_pColorSpace, "src"), gst_caps_new_simple ("video/x-raw-rgb","bpp", G_TYPE_INT, 24, NULL) );
713 
714  padlink = gst_pad_link( Gpad, videopad );
715 
716  if (padlink==GST_PAD_LINK_OK) {
717  //caps = gst_pad_get_caps( Gpad );
718  //pGsGraph->SetVideoFormat(caps);
719 #ifndef GSTVERSION
720  GstPad* srcRGB = gst_element_get_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
721  pGsGraph->cb_have_data_handler_id = gst_pad_add_buffer_probe_full ( srcRGB, G_CALLBACK (cb_have_data), pGsGraph, (GDestroyNotify) (cb_buffer_disconnected) );
722 #else
723  GstPad* srcRGB = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
724  pGsGraph->cb_have_data_handler_id = gst_pad_add_probe ( srcRGB,
725  GST_PAD_PROBE_TYPE_BUFFER,
726  (GstPadProbeCallback) cb_have_data,
727  pGsGraph,
728  (GDestroyNotify) (cb_buffer_disconnected) );
729 #endif
730  //cout << "cb_newpad: linked pads..." << endl;
731  }
732  }
733  }
734  }
735 
736  }
737 
738 }
739 
740 
741 //#ifndef GSTVERSION
742 void
744 {
745  decodebin = NULL;
746  GstCaps *caps = NULL;
747  GstPad *videopad = NULL;
748 // GstPad *audiopad = NULL;
749  GstPad *audiopadinconverter = NULL;
750  GstPadLinkReturn padlink;
751  gchar* padname = NULL;
752  const gchar* strname = NULL;
753  GstStructure *str = NULL;
754  GstPad* Gpad = (GstPad*) pad;
755 
756  moGsGraph* pGsGraph;
757  GstElement* SinkElement = NULL;
758 
759  cout << "pad added" << endl;
760  if (gst_pad_is_linked(Gpad)) {
761  return;
762  }
763 
764 
765  if (u_data!=0) {
766  pGsGraph = (moGsGraph*)u_data;
767  /* check media type */
768 #ifndef GSTVERSION
769  caps = gst_pad_get_caps (Gpad);
770 #else
771  caps = gst_pad_get_current_caps(Gpad);
772 #endif
773  padname = gst_pad_get_name(Gpad);
774  if (padname) {
775  str = gst_caps_get_structure (caps, 0);
776 
777  const gchar *sstr;
778 
779  sstr = gst_structure_to_string (str);
780  cout << "cb_newpad: new pad: " << padname << " caps:" << sstr << endl;
781 
782  strname = gst_structure_get_name (str);
783  cout << "cb_newpad: new pad: " << padname << " strname:" << strname << endl;
784 
785 
786  if (g_strrstr (strname, "audio")) {
787  pGsGraph->m_pAudioPad = Gpad;
788 
789  MODebug2->Message("moGsGraph::cb_pad_added: audio pad created > creating audio filters!");
790 
794  if (pGsGraph->m_pAudioConverter) {
795 
796  gboolean link_audioresult = gst_element_link_many( (GstElement*)pGsGraph->m_pAudioConverter,
797  (GstElement*)pGsGraph->m_pAudioVolume,
798  (GstElement*)pGsGraph->m_pAudioPanorama,
799  (GstElement*)pGsGraph->m_pAudioSink, NULL );
800  if (link_audioresult) {
801 #ifndef GSTVERSION
802  audiopadinconverter = gst_element_get_pad ( (GstElement*) pGsGraph->m_pAudioConverter, "sink");
803 #else
804  audiopadinconverter = gst_element_get_static_pad( (GstElement*) pGsGraph->m_pAudioConverter, "sink");
805 #endif
806  padlink = gst_pad_link (Gpad, audiopadinconverter);
807 
808 #ifndef GSTVERSION
809  GstPad* srcAudio = gst_element_get_pad ( (GstElement*)pGsGraph->m_pAudioConverter, "src");
810 #else
811  GstPad* srcAudio = gst_element_get_static_pad( (GstElement*)pGsGraph->m_pAudioConverter, "src");
812 #endif
813  if (padlink==GST_PAD_LINK_OK) {
814 #ifndef GSTVERSION
815  pGsGraph->cb_have_data_handler_id = gst_pad_add_buffer_probe_full ( srcAudio, G_CALLBACK (cb_have_data), pGsGraph, (GDestroyNotify) (cb_buffer_disconnected) );
816 #else
817  pGsGraph->cb_have_data_handler_id = gst_pad_add_probe( srcAudio,
818  GST_PAD_PROBE_TYPE_BUFFER,
819  (GstPadProbeCallback) cb_have_data,
820  pGsGraph,
821  (GDestroyNotify) (cb_buffer_disconnected) );
822 #endif
823 
824  }
825  }
826  } else if (pGsGraph->m_pAudioSink && 1==1) {
827 #ifndef GSTVERSION
828  audiopadinconverter = gst_element_get_pad ( (GstElement*) pGsGraph->m_pAudioSink, "sink");
829 #else
830  audiopadinconverter = gst_element_get_static_pad ( (GstElement*) pGsGraph->m_pAudioSink, "sink");
831 #endif
832  padlink = gst_pad_link (Gpad, audiopadinconverter);
833  }
834 
835 
836  } else if (g_strrstr (strname, "video")) {
837  pGsGraph->m_pVideoPad = Gpad;
838 
839  //MODebug2->Push(moText("moGsGraph::cb_pad_added: video pad created"));
840  cout << "is video" << endl;
841  if (pGsGraph->m_pVideoScale==NULL) {
 842  // direct version to videoscale
843  if (!(GstElement*)pGsGraph->m_pColorSpaceInterlace) {
844  SinkElement = (GstElement*)pGsGraph->m_pColorSpace;
845  cout << "SinkElement: m_pColorSpace" << endl;
846  } else {
847  SinkElement = (GstElement*)pGsGraph->m_pColorSpaceInterlace;
848  }
849 #ifndef GSTVERSION
850  videopad = gst_element_get_pad ( SinkElement, "sink");
851  if (videopad) {
852  padlink = gst_pad_link( Gpad, videopad );
853  }
854 #else
855  videopad = gst_element_get_static_pad( SinkElement, "sink");
856  if (videopad) {
857  padlink = gst_pad_link( Gpad, videopad );
858  }
859 #endif
 860  // version with deinterlace
861  //videopad = gst_element_get_pad ( (GstElement*)pGsGraph->m_pVideoDeinterlace, "sink");
862 
863  //bool res = gst_pad_set_caps( gst_element_get_pad ( pGsGraph->m_pColorSpace, "src"), gst_caps_new_simple ("video/x-raw-rgb","bpp", G_TYPE_INT, 24, NULL) );
864 
865  if (padlink==GST_PAD_LINK_OK) {
866 #ifndef GSTVERSION
867  //pGsGraph->SetVideoFormat(caps);
868  GstPad* srcRGB = gst_element_get_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
869  pGsGraph->cb_have_data_handler_id = gst_pad_add_buffer_probe_full ( srcRGB, G_CALLBACK (cb_have_data), pGsGraph, (GDestroyNotify) (cb_buffer_disconnected) );
870  //cout << "cb_newpad: linked pads..." << endl;
871 #else
872  GstPad* srcRGB = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
873  pGsGraph->cb_have_data_handler_id = gst_pad_add_probe ( srcRGB,
874  GST_PAD_PROBE_TYPE_BUFFER,
875  (GstPadProbeCallback) cb_have_data,
876  pGsGraph,
877  (GDestroyNotify) (cb_buffer_disconnected) );
878  //cout << "SinkElement linked" << endl;
879 #endif
880  }
881  } else {
 882  // version 2, with videoscale
 883 
 884  // direct version to videoscale
885 #ifndef GSTVERSION
886 videopad = gst_element_get_pad ( (GstElement*)pGsGraph->m_pVideoScale, "sink");
887 #else
888 videopad = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pVideoScale, "sink");
889 #endif
 890  // version with deinterlace
891  //videopad = gst_element_get_pad ( (GstElement*)pGsGraph->m_pVideoDeinterlace, "sink");
892  //bool res = gst_pad_set_caps( gst_element_get_pad ( pGsGraph->m_pColorSpace, "src"), gst_caps_new_simple ("video/x-raw-rgb","bpp", G_TYPE_INT, 24, NULL) );
893 
894  padlink = gst_pad_link( Gpad, videopad );
895 
896  if (padlink==GST_PAD_LINK_OK) {
897  //caps = gst_pad_get_caps( Gpad );
898  //pGsGraph->SetVideoFormat(caps);
899 #ifndef GSTVERSION
900  GstPad* srcRGB = gst_element_get_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
901  pGsGraph->cb_have_data_handler_id = gst_pad_add_buffer_probe_full ( srcRGB, G_CALLBACK (cb_have_data), pGsGraph, (GDestroyNotify) (cb_buffer_disconnected) );
902 #else
903  GstPad* srcRGB = gst_element_get_static_pad ( (GstElement*)pGsGraph->m_pColorSpace, "src");
904  pGsGraph->cb_have_data_handler_id = gst_pad_add_probe ( srcRGB,
905  GST_PAD_PROBE_TYPE_BUFFER,
906  (GstPadProbeCallback) cb_have_data,
907  pGsGraph,
908  (GDestroyNotify) (cb_buffer_disconnected) );
909 #endif
910  //cout << "cb_newpad: linked pads..." << endl;
911  }
912  }
913  }
914  }
915 
916  }
917 
918 }
919 //#else
920 //#endif
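Both pad-added handlers above are meant to be connected to the "pad-added" signal of the decoder bin; the actual connection happens in the graph-building methods of this class. A hypothetical connection, written as a sketch, looks like this:

/* Sketch (assumed wiring): hooking the handler above to the decoder bin. */
GstElement *decoder = gst_element_factory_make (DECODEBIN, "decoder");
signal_newpad_id = g_signal_connect (decoder, "pad-added",
                                     G_CALLBACK (moGsGraph::cb_pad_added_new), this);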
921 
922 #ifndef GSTVERSION
923 void
925  moGstBuffer *buffer,
926  moGstPad *pad,
927  moGPointer user_data)
928 {
929  static gboolean white = FALSE;
930 
931  GstElement* Gfakesrc = (GstElement*)fakesrc;
932  GstBuffer* Gbuffer = (GstBuffer*)buffer;
933  GstPad* Gpad = (GstPad*)pad;
934  Gpad = NULL;
935  Gfakesrc = NULL;
936  moGsGraph* pGsGraph;
937 
938 
939  if (user_data!=0) {
940  pGsGraph = (moGsGraph*)user_data;
942  //memset (GST_BUFFER_DATA (buffer), white ? 0x44 : 0x0, GST_BUFFER_SIZE (buffer));
943  pGsGraph->CopyVideoFrame( GST_BUFFER_DATA (Gbuffer), GST_BUFFER_SIZE (Gbuffer) );
944  //memcpy( GST_BUFFER_DATA (buffer), (void*)pGsGraph->GetVideoFrame(), GST_BUFFER_SIZE (buffer) );
945  } else {
947  memset (GST_BUFFER_DATA (Gbuffer), white ? 0xff : 0x0, GST_BUFFER_SIZE (Gbuffer));
948  }
949 
950 
951 
952 
953  GstCaps *caps;
954 
955  caps = gst_caps_new_simple ("video/x-raw-rgb", "width", G_TYPE_INT, 400,
956  "height", G_TYPE_INT, 300,
957  "bpp", G_TYPE_INT, 24,
958  "depth", G_TYPE_INT, 24,
959  "framerate", GST_TYPE_FRACTION, 10, 1,
960  NULL);
961  gst_buffer_set_caps (Gbuffer, caps);
962  gst_caps_unref (caps);
963  /* this makes the image black/white */
964 
965 
966  white = !white;
967 
968 }
969 #else
970 #endif
971 
972 /* returns TRUE if there was an error or we caught a keyboard interrupt. */
973 static gboolean
974 event_loop (GstElement * pipeline, gboolean blocking, GstState target_state)
975 {
976  GstBus *bus;
977  GstMessage *message = NULL;
978  gboolean res = FALSE;
979  gboolean buffering = FALSE;
980 
981  bus = gst_element_get_bus (GST_ELEMENT (pipeline));
982 
983  if (!bus) exit(1);
984 
985  while (TRUE) {
986  message = gst_bus_poll (bus, GST_MESSAGE_ANY, blocking ? -1 : 0);
987 
988  /* if the poll timed out, only when !blocking */
989  if (message == NULL)
990  goto exit;
991 
992  /* check if we need to dump messages to the console */
993  if (true) {
994  const GstStructure *s;
995 
996  s = gst_message_get_structure (message);
997 
998  g_print (("Got Message from element \"%s\" (%s): "),
999  GST_STR_NULL (GST_ELEMENT_NAME (GST_MESSAGE_SRC (message))),
1000  gst_message_type_get_name (GST_MESSAGE_TYPE (message)));
1001  if (s) {
1002  gchar *sstr;
1003 
1004  sstr = gst_structure_to_string (s);
1005  g_print ("%s\n", sstr);
1006  g_free (sstr);
1007  } else {
1008  g_print ("no message details\n");
1009  }
1010  }
1011 
1012  switch (GST_MESSAGE_TYPE (message)) {
1013 
1014  case GST_MESSAGE_WARNING:{
1015  GError *gerror;
1016  gchar *debug;
1017  gchar *name = gst_object_get_path_string (GST_MESSAGE_SRC (message));
1018 
1019  /* dump graph on warning */
1020  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (pipeline),
1021  GST_DEBUG_GRAPH_SHOW_ALL, "gst-launch.warning");
1022 
1023  gst_message_parse_warning (message, &gerror, &debug);
1024  g_print (("WARNING: from element %s: %s\n"), name, gerror->message);
1025  if (debug) {
1026  g_print (("Additional debug info:\n%s\n"), debug);
1027  }
1028  g_error_free (gerror);
1029  g_free (debug);
1030  g_free (name);
1031  break;
1032  }
1033  case GST_MESSAGE_ERROR:{
1034  GError *gerror;
1035  gchar *debug;
1036 
1037  /* dump graph on error */
1038  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (pipeline),
1039  GST_DEBUG_GRAPH_SHOW_ALL, "gst-launch.error");
1040 
1041  gst_message_parse_error (message, &gerror, &debug);
1042  gst_object_default_error (GST_MESSAGE_SRC (message), gerror, debug);
1043  g_error_free (gerror);
1044  g_free (debug);
1045  /* we have an error */
1046  res = TRUE;
1047  goto exit;
1048  }
1049  case GST_MESSAGE_STATE_CHANGED:{
1050  GstState old, mnew, pending;
1051 
1052  gst_message_parse_state_changed (message, &old, &mnew, &pending);
1053 
1054  /* debug each state change
1055  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "gst-launch");
1056  */
1057 
1058  /* we only care about pipeline state change messages */
1059  if (GST_MESSAGE_SRC (message) != GST_OBJECT_CAST (pipeline))
1060  break;
1061 
1062  /* debug only overall state changes
1063  {
1064  gchar *dump_name;
1065 
 1066  dump_name = g_strdup_printf ("gst-launch.%s", gst_element_state_get_name (mnew));
1067  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (pipeline), GST_DEBUG_GRAPH_SHOW_ALL, dump_name);
1068  g_free (dump_name);
1069  }
1070  */
1071 
1072  /* ignore when we are buffering since then we mess with the states
1073  * ourselves. */
1074  if (buffering) {
1075  fprintf (stderr,
1076  ("Prerolled, waiting for buffering to finish...\n"));
1077  break;
1078  }
1079 
1080  /* if we reached the final target state, exit */
1081  if (target_state == GST_STATE_PAUSED && mnew == target_state)
1082  goto exit;
1083 
1084  /* else not an interesting message */
1085  break;
1086  }
1087  case GST_MESSAGE_BUFFERING:{
1088  gint percent;
1089 
1090  gst_message_parse_buffering (message, &percent);
1091  fprintf (stderr, ("buffering... %d \r"), percent);
1092 
1093  /* no state management needed for live pipelines */
1094  /*
1095  if (is_live)
1096  break;
1097  */
1098 
1099  if (percent == 100) {
1100  /* a 100% message means buffering is done */
1101  buffering = FALSE;
1102  /* if the desired state is playing, go back */
1103  if (target_state == GST_STATE_PLAYING) {
1104  fprintf (stderr,
1105  ("Done buffering, setting pipeline to PLAYING ...\n"));
1106  gst_element_set_state (pipeline, GST_STATE_PLAYING);
1107  } else
1108  goto exit;
1109  } else {
1110  /* buffering busy */
1111  if (buffering == FALSE && target_state == GST_STATE_PLAYING) {
1112  /* we were not buffering but PLAYING, PAUSE the pipeline. */
1113  fprintf (stderr, ("Buffering, setting pipeline to PAUSED ...\n"));
1114  gst_element_set_state (pipeline, GST_STATE_PAUSED);
1115  }
1116  buffering = TRUE;
1117  }
1118  break;
1119  }
1120  case GST_MESSAGE_APPLICATION:{
1121  const GstStructure *s;
1122 
1123  s = gst_message_get_structure (message);
1124 
1125  if (gst_structure_has_name (s, "GstLaunchInterrupt")) {
1126  /* this application message is posted when we caught an interrupt and
1127  * we need to stop the pipeline. */
1128  fprintf (stderr, ("Interrupt: Stopping pipeline ...\n"));
1129  /* return TRUE when we caught an interrupt */
1130  res = TRUE;
1131  goto exit;
1132  }
1133  }
1134  default:
1135  /* just be quiet by default */
1136  break;
1137  }
1138  if (message)
1139  gst_message_unref (message);
1140  }
1141  g_assert_not_reached ();
1142 
1143 exit:
1144  {
1145  if (message)
1146  gst_message_unref (message);
1147  gst_object_unref (bus);
1148  return res;
1149  }
1150 }
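event_loop mirrors the bus-polling loop of gst-launch; with blocking set to TRUE and a PAUSED target it waits until the pipeline has prerolled (or an error or interrupt occurs). A minimal, hypothetical call site:

/* Sketch: block until the pipeline prerolls to PAUSED, or an error occurs. */
gst_element_set_state (pipeline, GST_STATE_PAUSED);
gboolean caught_error = event_loop (pipeline, TRUE, GST_STATE_PAUSED);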
1151 
1152 
1153 
1154 //===========================================
1155 //
 1156 // Class: moGsFramework
1157 //
1158 //===========================================
1159 
1160 //GMainLoop *moGsGraph::loop = g_main_loop_new (NULL, FALSE);
1161 
1163  // m_pDevEnum = NULL;
1164  // m_pEnum = NULL;
1165 }
1166 
1167 
1169 
1170 
1171 
1172 }
1173 
1242 moCaptureDevices* moGsFramework::LoadCaptureDevices() {
1243 
1244  GstElement* device;
1245  #ifndef GSTVERSION
1246  GstPropertyProbe* probe;
1247  #endif
1248  GValueArray* va;
1249  GList *plist;
1250  GParamSpec* pm;
1251  GValue* vdefault;
1252  GValue valDef = { 0, };
1253  //GList* list=NULL;
1254  //guint i=0;
1255  gchar* device_name;
1256 
1257  MODebug2->Message( "moGsFramework::LoadCaptureDevices running..." );
1258 
1259  m_CaptureDevices.Empty();
1260 
1261  for( int i=0; i<m_PreferredDevices.Count(); i++) {
1262  MODebug2->Message( moText("moGsFramework::PreferredDevices: ") + IntToStr(i)
1263  + moText(" Name: ") + m_PreferredDevices[i].GetName()
1264  + moText(" LabelName: ") + m_PreferredDevices[i].GetLabelName()
1265  + moText(" Path: ") + m_PreferredDevices[i].GetPath()
1266  + moText(" Port: ") + IntToStr(m_PreferredDevices[i].GetPort())
1267  + moText(" W: ") + IntToStr(m_PreferredDevices[i].GetSourceWidth())
1268  + moText(" H: ") + IntToStr(m_PreferredDevices[i].GetSourceHeight())
1269  + moText(" FlipH: ") + IntToStr(m_PreferredDevices[i].GetSourceFlipH())
1270  + moText(" FlipV: ") + IntToStr(m_PreferredDevices[i].GetSourceFlipV())
1271  + moText(" Bpp: ") + IntToStr(m_PreferredDevices[i].GetSourceBpp()) );
1272  }
1273 
1274 if (m_PreferredDevices.Count()==0) {
1275  moText cap_dev_name = moText("default");
1276  moCaptureDevice newdev;
1277  newdev.Present(true);
1278 
1279  newdev.SetName(cap_dev_name);
1280  newdev.SetLabelName("LIVEIN"+IntToStr(m_CaptureDevices.Count()));
1281 
1282  m_PreferredDevices.Add( newdev );
1283  }
1284  #ifdef MO_WIN32
1285  //m_CaptureDevices.Add( moCaptureDevice( moText("Laptop Integrated Webcam"), moText("webcam"), moText("-") ) );
1286  //m_CaptureDevices.Add( moCaptureDevice( moText("Default"), moText("-"), moText("-") ) );
1287  #ifdef GSTVERSION
1288  moText dname( "ksvideosrc" );
1289  #else
1290  moText dname( "dshowvideosrc" );
1291  #endif
1292  device_name = dname;
1293 
1294  for( MOuint i=0; i<m_PreferredDevices.Count();i++) {
1295  moCaptureDevice CaptDev = m_PreferredDevices[i];
1296  CaptDev.SetLabelName("LIVEIN"+IntToStr(m_CaptureDevices.Count()));
1297  AddCaptureDevice( CaptDev );
1298  MODebug2->Message( "moGsFramework::LoadCaptureDevices > Added preferred device: " + CaptDev.GetLabelName() );
1299  }
1300  //m_CaptureDevices.Add( moCaptureDevice( moText("Laptop Integrated Webcam"), moText("webcam"), moText("-") ) );
1301  //m_CaptureDevices.Add( moCaptureDevice( moText("Microsoft DV Camera and VCR"), moText("DV IEEE 1394"), moText("-"), 0 ) );
1302  //m_CaptureDevices.Add( moCaptureDevice( moText("VideoCAM Messenger"), moText("webcam"), moText("-") ) );
1303  //DIRECT SHOW TEST//
1304  /*
1305  HRESULT hr;
1306 
1307  // Create the System Device Enumerator.
1308  if(m_pDevEnum==NULL) {
1309  HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL,
1310  CLSCTX_INPROC_SERVER, IID_ICreateDevEnum,
1311  reinterpret_cast<void**>(&m_pDevEnum));
1312 
1313  if(SUCCEEDED(hr) && m_pEnum==NULL)
1314  {
1315  // Create an enumerator for the video capture category.
1316  hr = m_pDevEnum->CreateClassEnumerator(
1317  CLSID_VideoInputDeviceCategory,
1318  &m_pEnum, 0);
1319  } else {
1320  ShowError(hr);
1321  return &m_CaptureDevices;
1322  }
1323  }
1324  */
1325  #else
1326  #ifdef MO_MACOSX
1327  device_name = "wrappercamerabinsrc";
1328  #else
1329  device_name = moText("v4l2src");
1330  //m_CaptureDevices.Add( moCaptureDevice( moText("Default"), moText("default"), moText("/dev/video0") ) );
1331  #endif
1332  // in linux: for v4l2src device could be /dev/video0 - /dev/video1 etc...
1333  //m_CaptureDevices.Add( moCaptureDevice( moText("Default"), moText("default") );
1334  //m_CaptureDevices.Add( moCaptureDevice( moText("Laptop Integrated Webcam"), moText("webcam"), moText("/dev/video0") ) );
1335  //m_CaptureDevices.Add( moCaptureDevice( moText("DV"), moText("DV IEEE 1394"), moText("-"), 0 ) );
1336 
1337  for(int i=0; i<m_PreferredDevices.Count();i++) {
1338  moCaptureDevice CaptDev = m_PreferredDevices[i];
1339  CaptDev.SetLabelName("LIVEIN"+IntToStr(m_CaptureDevices.Count()));
1340  AddCaptureDevice( CaptDev );
1341  MODebug2->Message( "moGsFramework::LoadCaptureDevices > Added preferred device: " + CaptDev.GetLabelName() );
1342  }
1343 
1344 
1345 
1346  #endif
1347 
1348 
1349 #ifndef GSTVERSION
1350  try {
1351 
1352  device = gst_element_factory_make (device_name, "source");
1353  gst_element_get_state(device, NULL, NULL, 5 * GST_SECOND);
1354  moText probepname = "device-name";
1355  if (!device || !GST_IS_PROPERTY_PROBE(device))
1356  goto finish;
1357  probe = GST_PROPERTY_PROBE (device);
1358  if (probe) {
1359  plist = (GList *)gst_property_probe_get_properties( probe );
1360  if (plist) {
1361  plist = (GList *)g_list_first(plist);
1362  do {
1363  pm = (GParamSpec *)plist->data;
1364  if (pm) {
1365  if (pm->name) {
1366  probepname = moText((char*)pm->name);
1367  MODebug2->Message( "moGsFramework::LoadCaptureDevices > probe property:"+probepname);
1368  va = gst_property_probe_get_values(probe, pm);
1369  if (va) {
1370  MODebug2->Message( "moGsFramework::LoadCaptureDevices > probe property:"+probepname+" has values!");
1371  }
1372  }
1373  }
1374  } while( plist=g_list_next(plist) );
1375  }
1376  }
1377  va = gst_property_probe_get_values_name (probe, (char*)probepname);
1378  //va = gst_property_probe_get_values_name (probe, "device");
1379  if (!va) {
1380  //TRY TO SET DEFAULT VALUE FROM PARAM SPEC
1381  g_value_init( &valDef, G_PARAM_SPEC_VALUE_TYPE(pm) );
1382  //vdefault = g_param_spec_get_default_value ( pm );
1383  g_param_value_set_default( pm, &valDef );
1384  vdefault = &valDef;
1385  if (vdefault) {
1386  moText defaultText(g_value_get_string( vdefault ));
1387  MODebug2->Message("moGsFramework::LoadCaptureDevices > Default value for: \""+moText((char*)probepname)+"\" is "+defaultText);
1388  //G_VALUE_TYPE_NAME(vdefault);
1389  moText cap_dev_name = defaultText;
1390  moCaptureDevice newdev;
1391  newdev.Present(true);
1392 
1393  newdev.SetName(cap_dev_name);
1394  newdev.SetLabelName("LIVEIN"+IntToStr(m_CaptureDevices.Count()));
1395 
1396  m_CaptureDevices.Add( newdev );
1397 
1398  MODebug2->Message( "moGsFramework::LoadCaptureDevices > AUTO Added Default capture device: " + newdev.GetName() + " label:" + newdev.GetLabelName() );
1399  }
1400  }
1401  if (!va)
1402  goto finish;
1403  for(guint i=0; i < va->n_values; ++i) {
1404  GValue* v = g_value_array_get_nth(va, i);
1405  //GArray* v = g_array_index(va, i);
1406  GString* stv = g_string_new( g_value_get_string(v) );
1407  if (stv) {
1408  moText cap_dev_name = moText((char*)stv->str);
1409  moCaptureDevice newdev;
1410  newdev.Present(true);
1411 
1412  newdev.SetName(cap_dev_name);
1413  newdev.SetLabelName("LIVEIN"+IntToStr(m_CaptureDevices.Count()));
1414 
1415  m_CaptureDevices.Add( newdev );
1416  MODebug2->Message( "moGsFramework::LoadCaptureDevices > AUTO Added" );
1417  }
1418  //list = g_list_append(list, );
1419  }
1420  g_value_array_free(va);
1421 
1422  finish:
1423  {
1424  gst_element_set_state (device, GST_STATE_NULL);
1425  gst_object_unref(GST_OBJECT (device));
1426  }
1427  }
1428  catch(...) {
1429  MODebug2->Error("moGsFramework::LoadCaptureDevices > exception error.");
1430  }
1431 #else
1432 #if (GST_VERSION_MINOR >= 8)
1433  GstDeviceMonitor *monitor = NULL;
1434  GList *devices = NULL;
1435  GstStructure* properties = NULL;
1436 
1437  monitor = gst_device_monitor_new();
1438  GstCaps *mon_caps = gst_caps_new_empty_simple ("video/x-raw");
1439  gst_device_monitor_add_filter (monitor, "Video/Source", mon_caps);
1440  gst_caps_unref (mon_caps);
1441 
1442  if (!gst_device_monitor_start (monitor))
1443  g_error ("Failed to start device monitor!");
1444 
1445  devices = gst_device_monitor_get_devices (monitor);
1446  int idev = 0;
1447  if (devices != NULL) {
1448  while (devices != NULL) {
1449  GstDevice *device = (GstDevice*)devices->data;
1450 
 1451  gchar *device_class, *caps_str, *name, *device_path = NULL;
1452  GstCaps *caps;
1453  guint i, size = 0;
1454 
1455  caps = gst_device_get_caps (device);
1456  if (caps != NULL)
1457  size = gst_caps_get_size (caps);
1458 
1459  name = gst_device_get_display_name (device);
1460  properties = gst_device_get_properties(device);
1461  if (properties) {
1462 // gchar *propstr = gst_structure_to_string(properties);
 1463  device_path = (gchar*) gst_structure_get_string(properties,"device.path");
1464 // if (propstr) {
1465 // MODebug2->Message( moText("moGsFramework::LoadCaptureDevice > properties: ") + moText(propstr));
1466 // }
1467  }
1468 
1469  MODebug2->Message( moText("moGsFramework::LoadCaptureDevice > name: ") + moText(name)+ " path:"+moText(device_path));
1470 
1471  device_class = gst_device_get_device_class (device);
1472 
1473  for (i = 0; i < size; ++i) {
1474  GstStructure *s = gst_caps_get_structure (caps, i);
1475  caps_str = gst_structure_to_string (s);
1476  //g_print ("\t%s %s\n", (i == 0) ? "caps :" : " ", caps_str);
1477  MODebug2->Message( moText("moGsFramework::LoadCaptureDevice > ")+ moText(" caps: ") + moText(caps_str) + " device_class:" + moText(device_class) );
1478  g_free (caps_str);
1479  }
1480 
1481  moText cap_dev_name = name;
1482 
1483  if (m_CaptureDevices.Count()>idev) {
1484 
1485  moCaptureDevice upddev = m_CaptureDevices.Get(idev);
1486 
1487  if (idev>0) upddev.SetName(cap_dev_name);
1488  upddev.m_Path = device_path;
1489  upddev.Present(true);
1490  upddev.SetLabelName( "LIVEIN" + IntToStr(idev) );
1491  m_CaptureDevices.Set(idev,upddev);
1492 
1493  } else {
1494 
1495  moCaptureDevice newdev;
1496 
1497  newdev.Present(true);
1498  newdev.SetName(cap_dev_name);
1499  newdev.m_Path = device_path;
1500  newdev.SetLabelName( "LIVEIN" + IntToStr(idev) );
1501 
1502  m_CaptureDevices.Add( newdev );
1503  }
1504 
1505 
1506 
1507  //device_added (device);
1508  gst_object_unref (device);
1509  devices = g_list_remove_link (devices, devices);
1510  idev++;
1511  }
1512  } else {
1513  g_print ("No devices found!\n");
1514  }
1515 #endif // GST_VERSION_MINOR
1516 #endif
1517 
1519 
1520  return &m_CaptureDevices;
1521 
1522 }
1523 
1524 
1525 
1526 moCaptureDevices* moGsFramework::UpdateCaptureDevices() {
1527 
1528 
1529  return &m_CaptureDevices;
1530 
1531 }
1532 
1533 
1534 bool
1537  i = 0;
1538 
1539  return false;
1540 
1541 }
1542 
1543 bool
1546 
1547  for(int i=0; i<(int)m_CaptureDevices.Count(); i++) {
1548  if ( m_CaptureDevices[i].GetName() == p_capdev.GetName() ) {
1549  return false;
1550  }
1551  }
1552 
1553  m_CaptureDevices.Add( p_capdev );
1554 
1555  MODebug2->Message( moText("moGsFramework::AddCaptureDevice > Added capture device:") + p_capdev.GetName() + " label:" + p_capdev.GetLabelName() );
1556 
1557  return true;
1558 }
1559 
1560 //===========================================
1561 //
1562 // Class: moGsGraph
1563 //
1564 //===========================================
1565 
1567 
1568  m_pGstBus = NULL;
1569  m_pGMainLoop = NULL;
1570  m_pGMainContext = NULL;
1571  m_pGstPipeline = NULL;
1572  m_pGsFramework = NULL;
1573 
1574  m_pFileSource = NULL;
1575  m_pFinalSource = NULL;
1576  m_pFileSink = NULL;
1577  m_pRTSPSource = NULL;
1578  m_pRTSPDepay = NULL;
1579  m_pHTTPSource = NULL;
1580  m_pMultipartDemux = NULL;
1581  m_pJpegDecode = NULL;
1582  m_pDecoderBin = NULL;
1583  m_pEncoder = NULL;
1584 
1585  m_pTypeFind = NULL;
1586  m_pCapsFilter = NULL;
1587  m_pFakeSink = NULL;
1588  m_pFakeSource = NULL;
1589  m_pIdentity = NULL;
1590 
1591  m_pBucketsPool = NULL;
1592  m_pVideoScale = NULL;
1593  m_pVideoFlip = NULL;
1594  m_pVideoBalance = NULL;
1595 
1596  m_pVideoDeinterlace = NULL;
1597  m_pColorSpaceInterlace = NULL;
1598  m_pColorSpace = NULL;
1599 
1600  m_pAudioConverter = NULL;
1601  m_pAudioConverter2 = NULL;
1602  m_pAudioConverter3 = NULL;
1603  m_pAudioConverter4 = NULL;
1604  m_pAudioEcho = NULL;
1605  m_pAudioPanorama = NULL;
1606  m_pAudioAmplify = NULL;
1607  m_pAudioSpeed = NULL;
1608  m_pAudioVolume = NULL;
1609  m_pAudioSink = NULL;
1610 
1611  m_pAudioPad = NULL;
1612  m_pVideoPad = NULL;
1613 
1614  signal_newpad_id = 0;
1615  signal_handoff_id = 0;
1617  m_bEOS = false;
1618 
1619 }
1620 
1622  //last try to release objects
1623  FinishGraph();
1624 }
1625 
1626 
1627 /*
1628  GstElementFactory *factory;
1629  GstElement * element;
1630 
1631  // init GStreamer
1632  gst_init (&argc, &argv);
1633 
1634  // create element, method #2
1635  factory = gst_element_factory_find ("fakesrc");
1636  if (!factory) {
1637  g_print ("Failed to find factory of type 'fakesrc'\n");
1638  return -1;
1639  }
1640  element = gst_element_factory_create (factory, "source");
1641  if (!element) {
1642  g_print ("Failed to create element, even though its factory exists!\n");
1643  return -1;
1644  }
1645 
1646  gst_object_unref (GST_OBJECT (element));
1647 
1648  return 0;
1649 
1650 */
1651 
1652  //INIT METHODS
1653 bool
1655 
1656  signal_newpad_id = 0;
1657  signal_handoff_id = 0;
1659  m_BusWatchId = 0;
1660  m_bEOS = false;
1661 
 1662  // to be done in the console's main()...
 1663  // initialization of the gstreamer library
1664  //guint major, minor, micro, nano;
1665  //GError *errores;
1666 
1667  MODebug2->Message( moText("Initializing GStreamer"));
1668  //bool init_result = gst_init_check (NULL, NULL, &errores);
1669 
1670  //gst_init(NULL, NULL);
1671  //init_result = init_result && gst_controller_init(NULL,NULL);
1672 
1673  //gst_version (&major, &minor, &micro, &nano);
1674  //MODebug2->Message( moText("GStreamer version") + IntToStr(major) + moText(".") + IntToStr(minor) + moText(".") + IntToStr(minor));
1675  //char vers[10];
1676  //sprintf( vers, "version: %i.%i.%i.%i",major,minor, micro, nano);
1677 
1678  //if (init_result) MODebug2->Push(moText("Initializing GStreamer:OK "));
1679 
1680 // analogous to FilterGraph, with two parameters to instantiate the element: playbin
1681 //playbin
1682 //player
1683  MODebug2->Message( moText("creating pipeline"));
1684  m_pGstPipeline = gst_pipeline_new ("pipeline");
1685 
 1686  // find a filter type: factory = gst_element_factory_find ("fakesrc");
 1687  // create it: gst_element_factory_make ( factory, "player");
 1688  // or gst_element_factory_make ("playbin", "player");
 1689  // get the value of a property: g_object_get (G_OBJECT (element), "name", &name, NULL);
1690 
1691  MODebug2->Message( moText("creating bus interface"));
1692  m_pGstBus = gst_pipeline_get_bus (GST_PIPELINE (m_pGstPipeline));
1693  m_BusWatchId = gst_bus_add_watch ( (GstBus*)m_pGstBus, bus_call, this );
1694  gst_object_unref (m_pGstBus);
1695  m_pGstBus = NULL;
1696 
1697 /*
1698  GMainLoop *loop = g_main_loop_new( NULL, FALSE);
1699  m_pGMainLoop = (moGMainLoop*) loop;
1700  if (loop) {
1701  m_pGMainContext = (moGMainContext*) g_main_loop_get_context( loop );
1702  }
1703  */
1704  m_pGMainContext = (moGMainContext*) g_main_context_default();
 1705  // end of initialization
1706 
1707 /*
1708  m_pGstPipeline = gst_element_factory_make ("playbin", "play");
1709  g_object_set (G_OBJECT (m_pGstPipeline), "uri", "file:///home/fabri/plasma.mpg", NULL);
1710 
1711  m_pGstBus = gst_pipeline_get_bus (GST_PIPELINE (m_pGstPipeline));
1712  gst_bus_add_watch (m_pGstBus, bus_call, loop);
1713  gst_object_unref (m_pGstBus);
1714 
1715 
1716  CheckState( gst_element_set_state (m_pGstPipeline, GST_STATE_PAUSED), true );
1717 */
1718  /* now run */
1719 
1720  //g_main_loop_run (moGsGraph::loop);
1721  MODebug2->Message( moText("moGsGraph::Init result:") + moText(((m_pGstPipeline!=NULL) ? "success" : "failure")) );
1723  return (m_bInitialized);
1724 }
1725 
1726 
1727 bool
1729 
1730  if (IsRunning()) {
1731  Stop();
1732  }
1733 
1734  if (m_BusWatchId!=0) {
1735  if (!g_source_remove(m_BusWatchId)) {
1736  MODebug2->Error(moText("Error releasing bus call watch:") + IntToStr(m_BusWatchId));
1737  } else m_BusWatchId = 0;
1738  }
1739 
1740 
1741  if (m_pGMainLoop) {
1742 
1743  g_main_loop_quit( (GMainLoop*) m_pGMainLoop );
1744  g_main_loop_unref( (GMainLoop*) m_pGMainLoop);
1745 
1746  m_pGMainLoop = NULL;
1747  m_pGMainContext = NULL;
1748  }
1749 
1750  if (m_pColorSpace) {
1751 #ifndef GSTVERSION
1752  GstPad* srcRGB = gst_element_get_pad ( (GstElement*)m_pColorSpace, "src");
1753  if (srcRGB && cb_have_data_handler_id) gst_pad_remove_buffer_probe ( srcRGB, cb_have_data_handler_id );
1754 #endif
1756  }
1757 
1758  if (m_pColorSpaceInterlace) {
1759 #ifndef GSTVERSION
1760  GstPad* srcRGB = gst_element_get_pad ( (GstElement*)m_pColorSpaceInterlace, "src");
1761  if (srcRGB && cb_have_data_handler_id) gst_pad_remove_buffer_probe ( srcRGB, cb_have_data_handler_id );
1762 #endif
1764  }
1765 
1766 
1767  if (m_pFileSource) {
1768  //gst_object_unref( (GstElement*) m_pFileSource);
1769  m_pFileSource = NULL;
1770  }
1771 
1772  if (m_pJpegDecode) {
1773  //gst_object_unref( (GstElement*) m_pJpegDecode);
1774  m_pJpegDecode = NULL;
1775  }
1776 
1777  if (m_pMultipartDemux) {
1778  //gst_object_unref( (GstElement*) m_pMultipartDemux);
1779  m_pMultipartDemux = NULL;
1780  }
1781 
1782  if (m_pHTTPSource) {
1783  //gst_object_unref( (GstElement*) m_pHTTPSource);
1784  m_pHTTPSource = NULL;
1785  }
1786 
1787  if (m_pRTSPDepay) {
1788  //gst_object_unref( (GstElement*) m_pRTSPDepay);
1789  m_pRTSPDepay = NULL;
1790  }
1791 
1792  if (m_pRTSPSource) {
1793  //gst_object_unref( (GstElement*) m_pRTSPSource);
1794  m_pRTSPSource = NULL;
1795  }
1796 
1798  if (m_pFinalSource) {
1799  m_pFinalSource = NULL;
1800  }
1801 
1802  if (m_pColorSpace) {
1803  //gst_object_unref( (GstElement*) m_pColorSpace);
1804  m_pColorSpace = NULL;
1805  }
1806 
1807  if (m_pColorSpaceInterlace) {
1808  //gst_object_unref( (GstElement*) m_pColorSpaceInterlace);
1809  m_pColorSpaceInterlace = NULL;
1810  }
1811 
1812  if (m_pCapsFilter) {
1813  //gst_object_unref( (GstElement*) m_pCapsFilter);
1814  m_pCapsFilter = NULL;
1815  }
1816 
1817  if (m_pDecoderBin) {
1818  if (g_signal_handler_is_connected((GstElement*)m_pDecoderBin, signal_newpad_id))
1819  g_signal_handler_disconnect ( (GstElement*)m_pDecoderBin, signal_newpad_id );
1820  signal_newpad_id = 0;
1821  //gst_object_unref( (GstElement*) m_pDecoderBin);
1822  m_pDecoderBin = NULL;
1823  }
1824 
1825  if (m_pFakeSink) {
1826  //gst_object_unref( (GstElement*) m_pFakeSink);
1827  m_pFakeSink = NULL;
1828  }
1829 
1830  if (m_pAudioConverter) {
1831  //gst_object_unref( (GstElement*) m_pAudioConverter);
1832  m_pAudioConverter = NULL;
1833  }
1834 
1835  if (m_pAudioSink) {
1836  //gst_object_unref( (GstElement*) m_pAudioSink);
1837  m_pAudioSink = NULL;
1838  }
1839 
1840  if (m_pAudioPad) {
1841  //gst_object_unref( (GstPad*) m_pAudioPad);
1842  m_pAudioPad = NULL;
1843  }
1844 
1845  if (m_pVideoPad) {
1846  //gst_object_unref( (GstPad*) m_pVideoPad);
1847  m_pVideoPad = NULL;
1848  }
1849 
1850  if (m_pFakeSource) {
1851  if (g_signal_handler_is_connected((GstElement*)m_pFakeSource, signal_handoff_id))
1852  g_signal_handler_disconnect ( (GstElement*)m_pFakeSource, signal_handoff_id );
1853  signal_handoff_id = 0;
1854  //gst_object_unref( (GstElement*) m_pFakeSource);
1855  m_pFakeSource = NULL;
1856  }
1857 
1858  if (m_pFileSink) {
1859  //gst_object_unref( (GstElement*) m_pFileSink);
1860  m_pFileSink = NULL;
1861  }
1862 
1863  if (m_pGstBus) {
1864  //gst_object_unref( (GstElement*) m_pGstBus);
1865  m_pGstBus = NULL;
1866  }
1867 
1868  if (m_pVideoDeinterlace) {
1869  gst_object_unref( (GstElement*) m_pVideoDeinterlace);
1870  m_pVideoDeinterlace = NULL;
1871  }
1872 
1873  if (m_pVideoScale) {
1874  //gst_object_unref( (GstElement*) m_pVideoScale);
1875  m_pVideoScale = NULL;
1876  }
1877 
1878  if (m_pVideoFlip) {
1879  //gst_object_unref( (GstElement*) m_pVideoScale);
1880  m_pVideoFlip = NULL;
1881  }
1882 
1884  if (m_pGstPipeline) {
1885  gst_object_unref( (GstElement*) m_pGstPipeline);
1886  m_pGstPipeline = NULL;
1887  }
1888 
1889 
1890  return false;
1891 }
1892 
1893 bool
1894 moGsGraph::IsEOS() {
1895  return m_bEOS;
1896 }
1897 
1898 void
1899 moGsGraph::SetEOS(bool iseos) {
1900  m_bEOS = iseos;
1901 }
1902 
1903 
1904 
1905 //FILTER METHODS
1906 bool
1907 moGsGraph::SetCaptureDevice( moText deviceport, MOint idevice) {
1908 
1910  deviceport = "";
1911  idevice = 0;
1912  return false;
1913 }
1914 
1915 
1916 void
1917 moGsGraph::CopyVideoFrame( void* bufferdst, int size ) {
1918 
1919  //int ttid = m_pDirectorCore->GetResourceManager()->GetTextureMan()->GetTextureMOId( moText("preview_texture"), false);
1920  if (m_pBucketsPool) {
1921  moBucket* pBucket = m_pBucketsPool->RetreiveBucket();
1922 
1923  if (pBucket) {
1924  void* pbuf = pBucket->GetBuffer();
1925 
1926  pBucket->Lock();
1927  memcpy( bufferdst, (void*)pbuf, size );
1928  pBucket->Unlock();
1930 
1931  }
1932 
1933  }
1934 
1935 }
1936 
1947 bool
1948 moGsGraph::BuildLiveGraph( moBucketsPool *pBucketsPool, moCaptureDevice p_capdev) {
1949 
1950  return BuildLiveWebcamGraph( pBucketsPool, p_capdev );
1951 }
1952 
1967 bool
1968 moGsGraph::BuildLiveStreamingGraph( moBucketsPool *pBucketsPool, moText p_location ) {
1970  p_location = "";
1971  pBucketsPool = NULL;
1972  return false;
1973 }
1974 
1980 bool
1981 moGsGraph::BuildRecordGraph( moText filename, moBucketsPool *pBucketsPool ) {
1983  m_pBucketsPool = pBucketsPool;
1984  bool link_result = false;
1985  /*
1986  bool b_sourceselect = false;
1987  bool b_forcevideoscale = false;
1988  bool b_forcevideoflip = false;
1989  */
1990  //gchar* checkval;
1991  bool res = false;
1992 
1993 
1994  if (filename.Length()>0)
1995  {
1996 
1997  m_pFakeSource = gst_element_factory_make ("fakesrc", "source");
1998 
1999  /* setup fake source */
2000  if (m_pFakeSource) {
2001  g_object_set (G_OBJECT (m_pFakeSource),
2002  "signal-handoffs", TRUE,
2003  "sizemax", 400 * 300 * 3,
2004  "silent", TRUE,
2005  "sync", TRUE,
2006  "num-buffers", 30*200,
2007  "sizetype", 2, NULL);
2008  #ifndef GSTVERSION
2009  signal_handoff_id = g_signal_connect (m_pFakeSource, "handoff", G_CALLBACK (cb_handoff), this);
2010  #endif
2011 
2012  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pFakeSource );
2013  }
2014 
2015 
2016  m_pCapsFilter = gst_element_factory_make ("capsfilter", "filtsource");
2017  if (m_pCapsFilter) {
2018  g_object_set (G_OBJECT (m_pCapsFilter), "caps", gst_caps_new_simple ("video/x-raw-rgb",
2019  "width", G_TYPE_INT, 400,
2020  "height", G_TYPE_INT, 300,
2021  "framerate", GST_TYPE_FRACTION, 10, 1,
2022  "bpp", G_TYPE_INT, 24,
2023  "depth", G_TYPE_INT, 24,
2024  "red_mask",G_TYPE_INT, 255,
2025  "green_mask",G_TYPE_INT, 65280,
2026  "blue_mask",G_TYPE_INT, 16711680,
2027  "endianness", G_TYPE_INT, 4321,
2028  NULL), NULL);
2029  //depth=(int)24, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
2030  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pCapsFilter );
2031  }
2032 
2033 
2034  m_pColorSpace = gst_element_factory_make (VIDEOCONVERT, "color");
2035  if (m_pColorSpace) {
2036  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pColorSpace );
2037  }
2038 
2039  link_result = gst_element_link_many( (GstElement*) m_pFakeSource, (GstElement*) m_pCapsFilter, (GstElement*) m_pColorSpace, NULL );
2040 
2041  if (link_result) {
2042 
2043  m_pEncoder = gst_element_factory_make( "ffenc_mpeg1video", "encoder");
2044  if (m_pEncoder) {
2045  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pEncoder );
2046  }
2047 
2048  m_pMultiplexer = gst_element_factory_make( "ffmux_mpeg", "multiplexer");
2049  if (m_pMultiplexer) {
2050  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pMultiplexer );
2051  }
2052 
2053  m_pFileSink = gst_element_factory_make( "filesink", "filesink");
2054  if (m_pFileSink) {
2055  g_object_set (G_OBJECT (m_pFileSink), "location", (char*)filename, NULL);
2056  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pFileSink );
2057  }
2058 
2059  link_result = gst_element_link_many( (GstElement*) m_pColorSpace, (GstElement*) m_pEncoder, (GstElement*) m_pMultiplexer, (GstElement*) m_pFileSink, NULL );
2060  //link_result = gst_element_link_many( (GstElement*) m_pColorSpace, (GstElement*) m_pEncoder, NULL );
2061  //link_result = gst_element_link_many( (GstElement*) m_pColorSpace, (GstElement*) m_pEncoder, (GstElement*) m_pMultiplexer, NULL );
2062 
2063  if (link_result) {
2064  //if (CheckState( gst_element_set_state ((GstElement*) m_pGstPipeline, GST_STATE_PLAYING), false /*SYNCRUNASLI*/ )) {
2065  gst_element_set_state ( (GstElement*) m_pGstPipeline, GST_STATE_PLAYING);
2066 
2067  return true;
2068  //}
2069  }
2070  } else return false;
2071 
2072  }
2073 
2074  return false;
2075 }
2076 
2077 
2078 bool
2079 moGsGraph::BuildLiveDVGraph( moBucketsPool *pBucketsPool, moCaptureDevice &p_capdev ) {
2080 
2082  moCaptureDevice pp = p_capdev;
2083  pBucketsPool = NULL;
2084  return true;
2085 }
2086 
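// BuildLiveWebcamGraph: builds the live capture pipeline. The source depends on the
// device label/URL (rtspsrc + rtph264depay + h264parse for RTSP, souphttpsrc for HTTP,
// otherwise a platform webcam source), optionally followed by a source capsfilter and a
// videoflip, then a decodebin whose "pad-added" callback wires the dynamic pads into
// VIDEOCONVERT and the final RGB capsfilter, ending in an appsink (GStreamer 1.x) or a
// fakesink (0.10). Roughly:
//   <source> [! capsfilter] [! videoflip] ! decodebin ! videoconvert ! video/x-raw,format=RGB ! appsink
// The video format is taken from the appsink preroll sample (1.x) or via
// WaitForFormatDefinition (0.10).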
2107 bool
2108 moGsGraph::BuildLiveWebcamGraph( moBucketsPool *pBucketsPool, moCaptureDevice &p_capdev ) {
2109 
2110  m_pBucketsPool = pBucketsPool;
2111  GstCaps *caps = NULL;
2112  GstCaps *rsrc_caps = NULL;
2113  bool link_result = false;
2114 
2115  bool b_sourceselect = false;
2116  bool b_forcevideoscale = false;
2117  bool b_forcevideoflip = false;
2118 
2119  bool b_forcevideointerlace = false;
2120 
2121  //gchar* checkval;
2122  bool res = false;
2123  //GstPadLinkReturn ret_padlink;
2124 
2125  moGstElement* m_pColorSpaceSource = NULL;
2126 
2127  moGstElement* m_pCapsFilterSource = NULL;
2128  moGstElement* m_pCapsFilter2 = NULL;
2129 
2130  moText labelname;
2131  moText devicename;
2132  moText devicepath;
2133  MOint p_sourcewidth;
2134  MOint p_sourceheight;
2135  MOint p_sourcebpp;
2136  MOint p_forcewidth;
2137  MOint p_forceheight;
2138  MOint p_forceflipH;
2139  MOint p_forceflipV;
2140  moText colormode;
2141  moText devinfo;
2142 
2143  labelname = p_capdev.GetLabelName();
2144 
2145  devicename = p_capdev.GetName();
2146  devicepath = p_capdev.GetPath();
2147  switch( p_capdev.GetVideoFormat().m_ColorMode) {
2148  case YUV:
2149  colormode = moText("video/x-raw-yuv");
2150  break;
2151  case RGB:
2152  colormode = moText("video/x-raw-rgb");
2153  break;
2154  default:
2155  colormode = "";
2156  break;
2157  };
2158  p_sourcewidth = p_capdev.GetSourceWidth();
2159  p_sourceheight = p_capdev.GetSourceHeight();
2160  p_sourcebpp = p_capdev.GetSourceBpp();
2161 
2162  p_forcewidth = p_capdev.GetVideoFormat().m_Width;
2163  p_forceheight = p_capdev.GetVideoFormat().m_Height;
2164  p_forceflipH = p_capdev.GetSourceFlipH();
2165  p_forceflipV = p_capdev.GetSourceFlipV();
2166 
2167  if (p_forcewidth!=0 || p_forceheight!=0) {
2168  b_forcevideoscale = true;
2169  }
2170 
2171  if (p_forceflipH!=0 || p_forceflipV!=0) {
2172  b_forcevideoflip = true;
2173  }
2174 
2175  if (p_sourcewidth!=0 || p_sourceheight!=0) {
2176  b_sourceselect = true;
2177  }
2178 
2179  devinfo = moText("Label/Texture ") + labelname;
2180  devinfo+= moText("; DeviceName ") + devicename;
2181  devinfo+= moText("; DevicePath ") + devicepath;
2182  devinfo+= moText("; colormode ") + colormode;
2183  devinfo+= moText("; width ") + IntToStr(p_sourcewidth);
2184  devinfo+= moText("; height ") + IntToStr(p_sourceheight);
2185  devinfo+= moText("; flipH ") + IntToStr(p_forceflipH);
2186  devinfo+= moText("; flipV ") + IntToStr(p_forceflipV);
2187  if (devicename.Length()>0)
2188  {
2189 
2190  std::string dname;
2191 
2192  dname = devicename;
2193 
2194  //if (labelname==moText("RTSP")) {
2195  int rtspindex = -1;
2196  int httpindex = -1;
2197  rtspindex = dname.find("rtsp");
2198  httpindex = dname.find("http");
2199  //devicename.Find("http")==0
2200  //devicename.Find("rtsp");
2201  if ( labelname==moText("RTSP") || rtspindex == 0 ) {
2202 
2203  m_pRTSPSource = gst_element_factory_make ("rtspsrc", "source");
2204  m_pRTSPDepay = gst_element_factory_make ("rtph264depay", "depay");
2205  m_pMultipartDemux = gst_element_factory_make ("h264parse", "parse");
2206 
2206 
2207  if ( m_pRTSPSource && m_pRTSPDepay && m_pMultipartDemux ) {
2208  m_pRTSPDepaySink = gst_element_get_static_pad ( (GstElement*)m_pRTSPDepay, "sink" );
2209  signal_rtsppad_added_id = g_signal_connect (m_pRTSPSource, "pad-added", G_CALLBACK (cb_pad_added_new), (gpointer)this);
2210 #ifndef GSTVERSION
2211 signal_rtsppad_added_id = g_signal_connect (m_pRTSPSource, "pad-added", G_CALLBACK (on_rtsppadd_added), (gpointer)this);
2212 #endif
2213  }
2214  } else if (labelname==moText("HTTP") || httpindex==0 ) {
2215  m_pHTTPSource = gst_element_factory_make ("souphttpsrc", "source");
2216  //needed for decodebin2 TODO: check this in gstreamer 1.0
2217  //m_pMultipartDemux = gst_element_factory_make ("multipartdemux", "demux");
2218  if ( m_pHTTPSource && m_pMultipartDemux ) {
2219  //signal_rtsppad_added_id = g_signal_connect ( m_pMultipartDemux, "pad-added", G_CALLBACK (on_rtsppadd_added), (gpointer)this);
2220  }
2221  }
2222  else {
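 // Neither an RTSP nor an HTTP URL: pick a platform specific capture element
 // (ksvideosrc on win32 with GStreamer 1.x, dshowvideosrc with 0.10,
 // wrappercamerabinsrc on OSX, dv1394src for DV devices, v4l2src otherwise).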
2223 
2224  #ifdef MO_WIN32
2225  #ifdef GSTVERSION
2226  m_pFileSource = gst_element_factory_make ("ksvideosrc", "source");
2227  #else
2228  m_pFileSource = gst_element_factory_make ("dshowvideosrc", "source");
2229  #endif
2230  #else
2231  #ifdef GSTVERSION
2232  #ifdef MO_MACOSX
2233  m_pFileSource = gst_element_factory_make ("wrappercamerabinsrc", "source");
2234  cout << "wrappercamerabinsrc created!" << endl;
2235  #else
2236  if (devicename==moText("DV"))
2237  m_pFileSource = gst_element_factory_make ("dv1394src", "source");
2238  else {
2239  //m_pFileSource = gst_element_factory_make ("rpicamsrc", "source");
2240  //g_object_set (G_OBJECT (m_pFileSource), "preview", (bool)false, NULL);
2241  //g_object_set (G_OBJECT (m_pFileSource), "sensor-mode", (int)6, NULL);
2242  m_pFileSource = gst_element_factory_make ("v4l2src", "source");
2243  }
2244  #endif
2245 
2246  #else
2247  if (devicename==moText("DV"))
2248  m_pFileSource = gst_element_factory_make ("dv1394src", "source");
2249  else
2250  m_pFileSource = gst_element_factory_make ("v4l2src", "source");
2251  #endif
2252  #endif
2253 
2255  }
2256 
2258  if (devicename.Length() > 0 && ( devicename!=moText("default")) ) {
2259  g_object_set (G_OBJECT (m_pRTSPSource), "location", (char*)devicename, NULL);
2260  g_object_set (G_OBJECT (m_pRTSPSource), "latency", (guint) 0, NULL);
2261  g_object_set (G_OBJECT (m_pRTSPSource), "debug", (gboolean) false, NULL);
2262 
2263  //g_object_set (G_OBJECT (m_pRTSPSource), "protocols", (guint) 0x00000004, NULL);
2264  // caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,
2265  //\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ profile-level-id\=\(string\)4d001e\,\ sprop-parameter-sets\=\(string\)\"Z00AHp2oKAv+WbgICAoAAAMAAgAAAwAfCA\\\=\\\=\\\,aO48gA\\\=\\\=\""
2266  /*rsrc_caps = gst_caps_new_simple ( "application/x-rtp","media", G_TYPE_STRING, "video",
2267  "clock-rate", G_TYPE_INT, 90000,
2268  "encoding-name", G_TYPE_STRING, "H264",
2269  "payload", G_TYPE_INT, 96, NULL);*/
2270  //g_object_set (G_OBJECT (m_pRTSPSource), "caps", rsrc_caps, NULL);
2271 
2272  }
2273  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pRTSPSource );
2274  if (res) {
2276  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pRTSPDepay );
2277  if (res) {
2279  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pMultipartDemux );
2280  /*if (caps!=NULL and m_pRTSPSource) {
2281  m_pCapsFilterSource = gst_element_factory_make ("capsfilter", "filtsource");
2282  if (m_pCapsFilterSource) {
2283  res = gst_bin_add ( GST_BIN (m_pGstPipeline), (GstElement*) m_pCapsFilterSource);
2284  g_object_set (G_OBJECT (m_pCapsFilterSource), "caps", caps, NULL);
2285 
2286  }
2287  }*/
2288  link_result = gst_element_link_many( m_pRTSPDepay, m_pMultipartDemux, NULL );
2289  //(GstElement*) m_pRTSPSource,
2290  // (GstElement*) m_pCapsFilterSource,
2291  // (GstElement*) m_pRTSPDepay,
2292  //NULL );
2294  //link_result = true;
2295 
2296 
2297  }
2298 
2299  }
2300 
2301  if (link_result) {
2302  //m_pFinalSource = m_pRTSPDepay;
2304  } else {
2305  m_pFinalSource = NULL;
2306  }
2307 
2308  }
2309 
2311  if ( m_pHTTPSource /*&& m_pMultipartDemux*/ ) {
2312 
2313  g_object_set (G_OBJECT (m_pHTTPSource), "location", (char*)devicename, NULL);
2314  g_object_set (G_OBJECT (m_pHTTPSource), "automatic-redirect", TRUE, NULL);
2315 
2316  //g_object_set (G_OBJECT (m_pRTSPSource), "latency", (guint) 0, NULL);
2317  //g_object_set (G_OBJECT (m_pRTSPSource), "debug", (gboolean) true, NULL);
2318  //g_object_set (G_OBJECT (m_pRTSPSource), "protocols", (guint) 0x00000004, NULL);
2319 
2320 
2321  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pHTTPSource );
2322  //needed for decoderbin2 (version 2) not for version 1
2323  //res = res && gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pMultipartDemux );
2324  link_result = false;
2325  if (res) {
2326  //link_result = gst_element_link_many( (GstElement*) m_pHTTPSource, (GstElement*) m_pMultipartDemux, NULL );
2327  link_result = true;
2328  }
2329 
2330  if ( link_result ) {
2331  //m_pFinalSource = m_pMultipartDemux;
2333  m_pDecoderBin = gst_element_factory_make ( "decodebin", "decoder");
2334  } else {
2335  MODebug2->Error(moText("moGsGraph::BuildLiveWebcamGraph > SOUP HTTP source failed linking with MultipartDemux"));
2336  m_pFinalSource = NULL;
2337  }
2338 
2339  }
2340 
2341 
2343  if (m_pFileSource) {
2344  #ifdef MO_WIN32
2345  devicename.ToLower();
2346  if (devicename.Length() > 0 && ( devicename!=moText("default")) ) {
2347  g_object_set (G_OBJECT (m_pFileSource), "device-name", (char*)devicename, NULL);
2348  }
2349  #else
2350  if (devicename==moText("DV") ) {
2351  g_object_set (G_OBJECT (m_pFileSource), "port", 0, NULL);
2352  } else {
2353  devicename.ToLower();
2354  if ( devicename.Length() > 0 && ( devicename!=moText("default") ) ) {
2355  if (dname.find( "/dev/" )==0 ) {
2356  g_object_set (G_OBJECT (m_pFileSource), "device", (char*)devicename, NULL);
2357  } else if (devicepath.Find("/dev/" )==0) {
2358  g_object_set (G_OBJECT (m_pFileSource), "device", (char*)devicepath, NULL);
2359  } else {
2360  g_object_set (G_OBJECT (m_pFileSource), "device-name", (char*)devicename, NULL);
2361  }
2362  }
2363  }
2364  #endif
2365 
2366  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pFileSource );
2367  MODebug2->Message( moText("capture source created! > devicename: ") + (moText)devicename );
2369  }
2370 
2371  if (m_pFinalSource) {
2372  //g_object_get (G_OBJECT (m_pFileSource), "location", &checkval, NULL);
2373  //GstElement *filter = gst_element_factory_make ("capsfilter", "filter");
2374  //g_object_set (G_OBJECT (m_pFileSource), "pattern", GST_VIDEO_TEST_SRC_SNOW, NULL);
2375  //res = gst_pad_set_caps( gst_element_get_pad( m_pFileSource, "src" ), NULL);
2376 
2377  GstIterator* iterator = NULL;
2378  iterator = gst_element_iterate_src_pads( (GstElement*) m_pFinalSource );
2379 
2380 #ifndef GSTVERSION
2381  gpointer item;
2382 #else
2383  GValue item = G_VALUE_INIT;
2384 #endif
2385  GstPad* srcpad = NULL;
2386  GstCaps* itemcaps = NULL;
2387  GstCaps* capstpl = NULL;
2388  GstCaps* capsQuery = NULL;
2389  GstPad* peerPad = NULL;
2390 
2391 
2392  //GstPad* sinkpad = NULL;
2393 
2394  moText padname;
2395  moText icapsstr;
2396 
2397  bool done = FALSE;
2398  while (!done) {
2399  #ifndef GSTVERSION
2400  switch (gst_iterator_next (iterator, &item)) {
2401  #else
2402  switch (gst_iterator_next (iterator, &item)) {
2403  #endif
2404  case GST_ITERATOR_OK:
2405  //... use/change item here...
2406  #ifndef GSTVERSION
2407  srcpad = (GstPad*)item;
2408  #else
2409  srcpad = (GstPad*)g_value_dup_object (&item);
2410  #endif
2411  padname = gst_object_get_name((GstObject*) srcpad );
2412 
2413  MODebug2->Message( moText("filesrc src pad: checking caps: ") + (moText)padname );
2414 
2415  #ifndef GSTVERSION
2416  itemcaps = gst_pad_get_caps( srcpad );
2417  #else
2418  itemcaps = gst_pad_get_current_caps( srcpad );
2419  capstpl = gst_pad_get_pad_template_caps( srcpad );
2420  capsQuery = gst_pad_query_caps( srcpad, NULL );
2421  peerPad = gst_pad_get_peer( srcpad );
2422  //if (peerPad==NULL)
2423 
2424  //gst_pad_peer_query_caps()
2425  #endif
2426 
2427  if (capsQuery) {
2428 
2429  icapsstr = moText( gst_caps_to_string(capsQuery) );
2430  MODebug2->Message(icapsstr);
2431  }
2432  //gst_object_unref (item);
2433 #ifdef GSTVERSION
2434  g_value_reset (&item);
2435 #endif
2436  break;
2437  case GST_ITERATOR_RESYNC:
2438  //...rollback changes to items...
2439  gst_iterator_resync (iterator);
2440  break;
2441  case GST_ITERATOR_ERROR:
2442  //...wrong parameters were given...
2443  done = TRUE;
2444  break;
2445  case GST_ITERATOR_DONE:
2446  done = TRUE;
2447  break;
2448  }
2449  }
2450 
2451  gst_iterator_free (iterator);
2452 
2453  //queue = gst_element_factory_make("queue", "vqueue");
2454 //b_sourceselect = true;
2455 //colormode = "";
2456 
2457  if (b_sourceselect) {
2458  //#ifdef MO_WIN32
2459  #ifdef GSTVERSION
2460  //b_sourceselect = false;
2461  #endif // GSTVERSION
2462  //#endif // WIN32
2463  }
2464 
2465  if (b_sourceselect) {
2466  MODebug2->Message(moText("moGsGraph:: sourceselect: colormode: ") + (moText)colormode
2467  + moText(" wXh: ") + IntToStr(p_sourcewidth)
2468  + moText("X") + IntToStr(p_sourceheight)
2469  + moText(" bpp:") + IntToStr(p_sourcebpp));
2470  m_pCapsFilterSource = gst_element_factory_make ("capsfilter", "filtsource");
2471 
2472  if (m_pCapsFilterSource) {
2478 #ifndef GSTVERSION
2479  MODebug2->Message("colormode: "+ colormode );
2480  if (colormode=="") colormode = "video/x-raw-yuv";
2481  //if (colormode=="") colormode = "video/x-raw-rgb";
2482  g_object_set (G_OBJECT (m_pCapsFilterSource), "caps", gst_caps_new_simple ( colormode,
2483  "width", G_TYPE_INT, p_sourcewidth,
2484  "height", G_TYPE_INT, p_sourceheight,
2485  "depth", G_TYPE_INT, 24,
2486  "red_mask",G_TYPE_INT, 16711680,
2487  "green_mask",G_TYPE_INT, 65280,
2488  "blue_mask",G_TYPE_INT, 255,
2489  NULL), NULL);
2490 #else
2491 //
2492  moText colormodef = "";
2493 
2494  int opt_framerate = 15;
2495  if (colormode=="") {
2496  colormode = "video/x-raw";
2497  /*
2498  colormodef = "BGR";
2499  moText fullf = colormode+ ","+ colormodef;
2500  MODebug2->Message("moGsGraph::BuildLiveWebcamGraph > p_sourcewidth:" + fullf );
2501 
2502  g_object_set (G_OBJECT (m_pCapsFilterSource), "caps", gst_caps_new_simple ( colormode,
2503  "format", G_TYPE_STRING, (char*)colormodef,
2504  "width", G_TYPE_INT, p_sourcewidth,
2505  "height", G_TYPE_INT, p_sourceheight,
2506  "framerate", GST_TYPE_FRACTION, opt_framerate, 1,
2507  NULL), NULL);
2508  */
2509  //colormodef = "UYVY";
2510  colormodef = "RGB";
2511  moText fullf = colormode+ ","+ colormodef;
2512  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > (colormode, format): (") + fullf + moText(")") );
2513  //opt_framerate = 30;
2514  /*g_object_set (G_OBJECT (m_pCapsFilterSource), "caps", gst_caps_new_simple ( colormode,
2515  "format", G_TYPE_STRING, (char*)colormodef,
2516  "width", G_TYPE_INT, p_sourcewidth,
2517  "height", G_TYPE_INT, p_sourceheight,
2518  "framerate", GST_TYPE_FRACTION, opt_framerate, 1,
2519  NULL), NULL);
2520  */
2521  g_object_set (G_OBJECT (m_pCapsFilterSource), "caps", gst_caps_new_simple ( colormode,
2522  //"format", G_TYPE_STRING, (char*)colormodef,
2523  "width", G_TYPE_INT, p_sourcewidth,
2524  "height", G_TYPE_INT, p_sourceheight,
2525  NULL), NULL);
2526 
2527  } else {
2528 
2529  colormode="video/x-raw-yuv";
2530 
2531  if (colormode=="video/x-raw-rgb") {
2532  colormodef = "RGB";
2533  } else if (colormode=="video/x-raw-yuv") {
2534  colormodef = "YUV";
2535  }
2536 
2537  colormode="video/x-raw";
2538 
2539  g_object_set (G_OBJECT (m_pCapsFilterSource), "caps", gst_caps_new_simple ( colormode,
2540  //"format", G_TYPE_STRING, "I420",
2541  /*"format", G_TYPE_STRING, (char*)colormodef,*/
2542  "width", G_TYPE_INT, p_sourcewidth,
2543  "height", G_TYPE_INT, p_sourceheight,
2544  NULL), NULL);
2545 
2546  }
2547 
2548 
2549 #endif
2550  //depth=(int)24, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
2551  /*
2552  "bpp", G_TYPE_INT, p_sourcebpp,
2553  "depth", G_TYPE_INT, 24,
2554  "red_mask",G_TYPE_INT, 255,
2555  "green_mask",G_TYPE_INT, 65280,
2556  "blue_mask",G_TYPE_INT, 16711680,
2557  "endianness", G_TYPE_INT, 4321
2558  */
2559  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pCapsFilterSource );
2560  if (res) { MODebug2->Message(moText("moGsGraph:: added capsfilter source!") ) ; }
2561  else MODebug2->Error(moText("moGsGraph:: failed adding capsfilter source!"));
2562  }
2563  }
2564 
2565 
2566  if (b_forcevideoflip) {
2592  m_pVideoFlip = gst_element_factory_make ("videoflip", "flip");
2593  if (m_pVideoFlip) {
2594  int method = 0;//identity
2595  if (p_forceflipH==1 && p_forceflipV==1) {
2596  method = 2;
2597  } else if (p_forceflipH==1) {
2598  method = 4;
2599  } else if (p_forceflipV==1) {
2600  method = 5;
2601  }
2602 #ifndef GSTVERSION
2603  g_object_set (G_OBJECT (m_pVideoFlip), "method", method, NULL);
2604 #endif
2605  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pVideoFlip );
2606  g_object_set (G_OBJECT (m_pVideoFlip), "method", (int)method, NULL);
2607 
2608 
2609  }
2610  }
2611 
2612 
2613  b_forcevideoscale = false;
2614  if (b_forcevideoscale) {
2615 
2616  m_pVideoScale = gst_element_factory_make ("videoscale", "scale");
2617  if (m_pVideoScale) {
2618  int method = 0;
2619  colormode = "video/x-raw";
2620  MODebug2->Message(moText("moGsGraph:: creating videoscale!") ) ;
2621 #ifndef GSTVERSION
2622  g_object_set (G_OBJECT (m_pVideoScale), "method", method, NULL);
2623 #endif
2624  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pVideoScale );
2625 
2626  m_pCapsFilter2 = gst_element_factory_make ("capsfilter", "filt2");
2627  if (m_pCapsFilter2) {
2628  if (b_forcevideoscale) {
2629  g_object_set (G_OBJECT (m_pCapsFilter2), "caps", gst_caps_new_simple ( colormode,
2630  "width", G_TYPE_INT, p_forcewidth,
2631  "height", G_TYPE_INT, p_forceheight,
2632  NULL), NULL);
2633  } else {
2634  g_object_set (G_OBJECT (m_pCapsFilter2), "caps", gst_caps_new_simple ( colormode,
2635  "width", G_TYPE_INT, 240,
2636  "height", G_TYPE_INT, 160,
2637  NULL), NULL);
2638  }
2639  //depth=(int)24, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
2640  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pCapsFilter2 );
2641  }
2642 
2643 
2644  }
2645  }
2646 
2647  b_forcevideointerlace = false;
2648  if (b_forcevideointerlace) {
2649  m_pColorSpaceInterlace = gst_element_factory_make (VIDEOCONVERT, "colordeinterlace");
2650  if (m_pColorSpaceInterlace) {
2651  MODebug2->Message(moText("moGsGraph:: created videoconvert before deinterlace!") ) ;
2652  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pColorSpaceInterlace );
2653  }
2654 
2655 
2656  m_pVideoDeinterlace = gst_element_factory_make ("ffdeinterlace", "deinterlace");
2657  if (m_pVideoDeinterlace) {
2658  //int tff = 2;//bottom field first
2659  //g_object_set (G_OBJECT (m_pVideoDeinterlace), "tff", &tff, NULL);
2660  MODebug2->Message(moText("moGsGraph:: created ffdeinterlace!") ) ;
2661  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pVideoDeinterlace );
2662  }
2663  }
2664 
2665  m_pColorSpace = gst_element_factory_make (VIDEOCONVERT, "color");
2666  if (m_pColorSpace) {
2667  MODebug2->Message(moText("moGsGraph:: created videoconvert for final color!") ) ;
2668  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pColorSpace );
2669  }
2670 
2671  m_pCapsFilter = gst_element_factory_make ("capsfilter", "filt");
2672  if (m_pCapsFilter) {
2673  MODebug2->Message(moText("moGsGraph:: created last capsfilter!") ) ;
2674 #ifndef GSTVERSION
2675  g_object_set (G_OBJECT (m_pCapsFilter), "caps", gst_caps_new_simple ("video/x-raw-rgb",
2676  "bpp", G_TYPE_INT, 24,
2677  "depth", G_TYPE_INT, 24,
2678  "red_mask",G_TYPE_INT, 255,
2679  "green_mask",G_TYPE_INT, 65280,
2680  "blue_mask",G_TYPE_INT, 16711680,
2681  "endianness", G_TYPE_INT, 4321,
2682  NULL), NULL);
2683  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pCapsFilter );
2684 #else
2685  caps = gst_caps_new_simple ( "video/x-raw",
2686  "format", G_TYPE_STRING, "RGB",
2687  NULL);
2688  g_object_set (G_OBJECT (m_pCapsFilter), "caps", caps, NULL);
2689  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pCapsFilter );
2690 
2691 #endif
2692  //depth=(int)24, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
2693 
2694  }
2695 
2696  //RetreivePads( m_pFileSource );
2697 /*
2698  m_pAudioConverter = gst_element_factory_make ("audioresample", "resample");
2699 
2700  if (m_pAudioConverter) {
2701  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pAudioConverter );
2702  }
2703  */
2704 
2705 
2706  if (m_pDecoderBin==NULL) m_pDecoderBin = gst_element_factory_make ( DECODEBIN, "decoder");
2707  if (m_pDecoderBin) {
2708  MODebug2->Message(moText("moGsGraph:: created decoder bin! ") + DECODEBIN ) ;
2709  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pDecoderBin );
2710 #ifndef GSTVERSION
2711  signal_newpad_id = g_signal_connect (m_pDecoderBin, "new-decoded-pad", G_CALLBACK (cb_newpad), (gpointer)this);
2712 #else
2713  signal_newpad_id = g_signal_connect (m_pDecoderBin, "pad-added", G_CALLBACK (cb_pad_added_new), (gpointer)this);
2714  MODebug2->Message( moText("moGsGraph:: added signal to Decoder Bin, \"pad-added\": ") + IntToStr(signal_newpad_id) ) ;
2715 #endif
2716 
2717 #ifndef GSTVERSION
2718  m_pFakeSink = gst_element_factory_make ("fakesink", "destout");
2719 #else
2720  cout << "creating FakeSink from appsink" << endl;
2721  m_pFakeSink = gst_element_factory_make ("appsink", "destout");
2722 #endif
2723  //RetreivePads( m_pFakeSink );
2724  if (m_pFakeSink) {
2725  MODebug2->Message(moText("moGsGraph:: created FakeSink! ") ) ;
2726 #ifdef GSTVERSION
2727  g_object_set (G_OBJECT (m_pFakeSink), "caps", caps, NULL);
2728  g_object_set (G_OBJECT (m_pFakeSink), "sync", false, NULL);
2729  g_object_set (G_OBJECT (m_pFakeSink), "drop", true, NULL);
2730 #endif
2731  res = gst_bin_add (GST_BIN (m_pGstPipeline), (GstElement*) m_pFakeSink );
2732 
2733 
2734  MODebug2->Message(moText("moGsGraph:: Try linkage!! sourceselect?: ") + IntToStr(b_sourceselect) ) ;
2735  //b_sourceselect = true;
2736  if (b_sourceselect) {
2737  cout << "linking m_pFinalSource, m_pCapsFilterSource, m_pDecoderBin" << endl;
2738  if (b_forcevideoflip) {
2739  link_result = gst_element_link_many( (GstElement*) m_pFinalSource, (GstElement*) m_pCapsFilterSource,
2741  (GstElement*) m_pVideoFlip,
2742  (GstElement*) m_pDecoderBin, NULL );
2743  } else {
2744  link_result = gst_element_link_many( (GstElement*) m_pFinalSource, (GstElement*) m_pCapsFilterSource,
2746  (GstElement*) m_pDecoderBin, NULL );
2747  }
2748  } else {
2749  if (b_forcevideoflip) {
2750  link_result = gst_element_link_many( (GstElement*) m_pFinalSource,
2751  (GstElement*)m_pVideoFlip,
2752  (GstElement*) m_pDecoderBin, NULL );
2753  } else {
2754  cout << "linking m_pFinalSource, m_pDecoderBin" << endl;
2755  link_result = gst_element_link_many( (GstElement*) m_pFinalSource,
2756  (GstElement*) m_pDecoderBin, NULL );
2757  }
2758  }
2759 
2760 
2761  if (link_result) {
2762  MODebug2->Message(moText("moGsGraph:: Source linkage ok! ")+devinfo ) ;
2763  if (b_forcevideoscale) {
2764  cout << "linking forcing videoscale" << endl;
2765  if (b_forcevideointerlace)
2766  link_result = gst_element_link_many( (GstElement*) m_pVideoScale, (GstElement*)m_pCapsFilter2, (GstElement*) m_pColorSpaceInterlace, (GstElement*) m_pVideoDeinterlace, (GstElement*) m_pColorSpace, (GstElement*) m_pCapsFilter, (GstElement*) m_pFakeSink, NULL );
2767  else
2768  link_result = gst_element_link_many( (GstElement*) m_pVideoScale, (GstElement*)m_pCapsFilter2, (GstElement*) m_pColorSpace, (GstElement*) m_pCapsFilter, (GstElement*) m_pFakeSink, NULL );
2769 
2770  //old deinterlace
2771  //link_result = gst_element_link_many( (GstElement*) m_pVideoDeinterlace, (GstElement*) m_pVideoScale, (GstElement*)m_pCapsFilter2, (GstElement*) m_pColorSpace, (GstElement*) m_pCapsFilter, (GstElement*) m_pFakeSink, NULL );
2772  } else {
2773  cout << "linking no videoscale" << endl;
2774  //link_result = gst_element_link_many( (GstElement*) m_pVideoDeinterlace, (GstElement*) m_pColorSpace, (GstElement*) m_pCapsFilter, (GstElement*) m_pFakeSink, NULL );
2775  if (b_forcevideointerlace) {
2776  cout << "linking m_pColorSpaceInterlace, m_pVideoDeinterlace, m_pColorSpace, m_pCapsFilter, m_pFakeSink" << endl;
2777  link_result = gst_element_link_many( (GstElement*) m_pColorSpaceInterlace,
2778  (GstElement*) m_pVideoDeinterlace,
2779  (GstElement*)m_pColorSpace,
2780  (GstElement*) m_pCapsFilter,
2781  (GstElement*) m_pFakeSink,
2782  NULL );
2783  } else {
2784  cout << "linking m_pColorSpace, /*m_pCapsFilter*/, m_pFakeSink" << endl;
2785  link_result = gst_element_link_many(
2786  (GstElement*) m_pColorSpace,
2787 #ifndef GSTVERSION
2788  (GstElement*) m_pCapsFilter,
2789 #endif
2790  (GstElement*) m_pFakeSink, NULL );
2791 
2792 
2793  }
2794  //link_result = gst_element_link_filtered( (GstElement*) m_pColorSpace, (GstElement*) m_pFakeSink, NULL );
2795  //link_result = gst_element_link_many( (GstElement*) m_pColorSpace, (GstElement*) m_pCapsFilter, (GstElement*) m_pFakeSink, NULL );
2796  }
2797 
2798  if (link_result) {
2799  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > play pipeline > ")+devinfo);
2800  bool ret = CheckState( gst_element_set_state ((GstElement*) m_pGstPipeline, GST_STATE_PLAYING), true /*SYNCRUNASLI*/ );
2801  if (ret==false) ret = CheckState( gst_element_set_state ((GstElement*) m_pGstPipeline, GST_STATE_PLAYING), true /*SYNCRUNASLI*/ );
2802  if (ret==false) {
2803  MODebug2->Error( moText("moGsGraph::BuildLiveWebcamGraph > No playing. ")+devinfo);
2804  } else {
2805  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > GST_STATE_PLAYING > OK.")+devinfo);
2806  }
2807 
2808  //GetState();
2809 
2810 #ifdef GSTVERSION
2811 
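 // GStreamer 1.x: pull one preroll sample from the appsink to learn the negotiated
 // caps, then switch the sink to signal driven ("new-sample") pulling with a
 // one-buffer queue and no wait on EOS.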
2812  GstSample *sample=NULL;
2813  //g_signal_emit_by_name ( m_pFakeSink, "pull-sample", &sample, NULL);
2814 
2815  if (ret) {
2816  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > gst_app_sink_pull_preroll for appsink. ")+devinfo);
2817  sample = gst_app_sink_pull_preroll( (GstAppSink*) m_pFakeSink );
2818  }
2819 
2820  if (sample) {
2821  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > RECEIVED sample from gst_app_sink_pull_preroll!")+devinfo);
2822  GstBuffer *Gbuffer;
2823  GstCaps *bcaps;
2824  GstStructure *bstr;
2825 
2830  bcaps = gst_sample_get_caps( sample );
2831  if (bcaps) {
2832  Gbuffer = gst_sample_get_buffer (sample);
2833  SetVideoFormat( bcaps, Gbuffer );
2834  gst_app_sink_set_emit_signals((GstAppSink*)m_pFakeSink, true);
2835  gst_app_sink_set_drop((GstAppSink*)m_pFakeSink, true);
2836  gst_app_sink_set_wait_on_eos ((GstAppSink*)m_pFakeSink, false);
2837  //g_object_set (G_OBJECT (m_pFakeSink), "sync", false, NULL);
2838  gst_app_sink_set_max_buffers((GstAppSink*)m_pFakeSink, 1);
2839  g_signal_connect( (GstElement*)m_pFakeSink, "new-sample", G_CALLBACK (appsink_new_sample), (gpointer)this );
2840  //gst_app_sink_set_callbacks( (GstAppSink*)m_pFakeSink, )
2841 
2842  }
2843  } else MODebug2->Error( moText("moGsGraph::BuildLiveWebcamGraph > NO sample from gst_app_sink_pull_preroll!")+devinfo);
2844 
2845  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > gst_app_sink_pull_preroll for appsink ended.")+devinfo);
2846 #else
2847  WaitForFormatDefinition( 1600 );
2848 #endif
2849 
2850  MODebug2->Message( moText("moGsGraph::BuildLiveWebcamGraph > graph built.")+devinfo);
2851  //cout << "state gstreamer finish" << endl;
2852 
2853  //event_loop( (GstElement*) m_pGstPipeline, false, GST_STATE_PAUSED);
2854 
2855  return true;
2856 
2857  } else {
2858  MODebug2->Error(moText("moGsGraph::BuildLiveWebcamGraph > m_pColorSpace m_pCapsFilter m_pFakeSink linking failed.")+devinfo);
2859  event_loop( (GstElement*) m_pGstPipeline, false, GST_STATE_PAUSED);
2860  }
2861  } else {
2862  MODebug2->Error(moText("moGsGraph::BuildLiveWebcamGraph > src and decodebin linkage failed:")+devinfo);
2863  event_loop( (GstElement*) m_pGstPipeline, false, GST_STATE_PAUSED);
2864  }
2865 
2866  } else {
2867  MODebug2->Error(moText("moGsGraph::BuildLiveWebcamGraph > fakesink construction failed.")+devinfo);
2868  event_loop( (GstElement*) m_pGstPipeline, false, GST_STATE_PAUSED);
2869  }
2870  } else {
2871  MODebug2->Error(moText("moGsGraph::BuildLiveWebcamGraph > decodebin construction failed.")+devinfo);
2872  event_loop( (GstElement*) m_pGstPipeline, false, GST_STATE_PAUSED);
2873  }
2874  } else {
2875  MODebug2->Error(moText("moGsGraph::BuildLiveWebcamGraph > final source failed.")+devinfo);
2876  event_loop( (GstElement*) m_pGstPipeline, false, GST_STATE_PAUSED);
2877  }
2878  return false;
2879 
2880 
2881  }
2882 
2883  return true;
2884 }
2885 
2886 bool moGsGraph::BuildLiveQTVideoGraph( moText filename , moBucketsPool *pBucketsPool ) {
2887 
2888  return BuildLiveVideoGraph( filename, pBucketsPool );
2889 
2890 }
2891 
2892 
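// RetreivePads: debugging helper that iterates every pad of FilterElement and queries
// its name and active/linked/blocking state; the results are only meant to be inspected
// with a debugger.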
2893 void
2894 moGsGraph::RetreivePads( moGstElement* FilterElement) {
2895 
2896  GstIterator* piter;
2897  GstPad* ppad;
2898  gchar* nname;
2899 #ifndef GSTVERSION
2900  gpointer ppointer;
2901 #else
2902  GValue gvalue = G_VALUE_INIT;
2903 #endif
2904  bool done;
2905  bool res = false;
2906 
2907  piter = gst_element_iterate_pads( (GstElement*)FilterElement );
2908 
2909  done = FALSE;
2910  while (!done) {
2911 #ifndef GSTVERSION
2912  switch (gst_iterator_next (piter, &ppointer)) {
2913 #else
2914  switch (gst_iterator_next (piter, &gvalue)) {
2915 #endif
2916  case GST_ITERATOR_OK:
2917  //... use/change item here...
2918 #ifndef GSTVERSION
2919  ppad = (GstPad*) ppointer;
2920 #else
2921  ppad = (GstPad*) g_value_dup_object( &gvalue );
2922 #endif
2923  nname = gst_pad_get_name(ppad);
2924  res = gst_pad_is_active(ppad);
2925  res = gst_pad_is_linked(ppad);
2926  res = gst_pad_is_blocking(ppad);
2927 #ifndef GSTVERSION
2928  gst_object_unref (ppointer);
2929 #else
2930  g_value_reset( &gvalue );
2931 #endif
2932  break;
2933 
2934  case GST_ITERATOR_RESYNC:
2935  //...rollback changes to items...
2936  gst_iterator_resync (piter);
2937  break;
2938 
2939  case GST_ITERATOR_ERROR:
2940  //...wrong parameters were given...
2941  done = TRUE;
2942  break;
2943 
2944  case GST_ITERATOR_DONE:
2945  done = TRUE;
2946  break;
2947  }
2948  }
2949  gst_iterator_free (piter);
2950 
2951  return;
2952 }
2953 /*
2954 bool
2955 moGsGraph::BuildTestGraph( moBucketsPool *pBucketsPool ) {
2956 typedef enum {
2957  GST_VIDEO_TEST_SRC_SMPTE,
2958  GST_VIDEO_TEST_SRC_SNOW,
2959  GST_VIDEO_TEST_SRC_BLACK,
2960  GST_VIDEO_TEST_SRC_WHITE,
2961  GST_VIDEO_TEST_SRC_RED,
2962  GST_VIDEO_TEST_SRC_GREEN,
2963  GST_VIDEO_TEST_SRC_BLUE,
2964  GST_VIDEO_TEST_SRC_CHECKERS1,
2965  GST_VIDEO_TEST_SRC_CHECKERS2,
2966  GST_VIDEO_TEST_SRC_CHECKERS4,
2967  GST_VIDEO_TEST_SRC_CHECKERS8,
2968  GST_VIDEO_TEST_SRC_CIRCULAR,
2969  GST_VIDEO_TEST_SRC_BLINK
2970 } GstVideoTestSrcPattern;
2971 
2972 }*/
2973 
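// WaitForFormatDefinition: busy-waits up to 'timeout' milliseconds for the caps callback
// to fill m_VideoFormat (returning as soon as m_WaitForFormat is cleared) and logs an
// error if the format never arrives.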
2974 void
2975 moGsGraph::WaitForFormatDefinition( MOulong timeout ) {
2976 
2977  MOulong time0 = moGetTicksAbsolute();
2978  MOulong time1 = time0;
2979 
2980  //cout << "waiting for format definition..." << timeout << endl;
2981 
2982  while((time1 - time0) < timeout) {
2983  if (!m_VideoFormat.m_WaitForFormat) {
2984  return;
2985  }
2986  time1 = moGetTicksAbsolute();
2987  //cout << (time1 - time0) << endl;
2988  continue;
2989  }
2990  //cout << "elapsed:" << (time1 - time0) << "m_WaitForFormat:" << m_VideoFormat.m_WaitForFormat << "w:" << m_VideoFormat.m_Width << " x h:" << m_VideoFormat.m_Height << endl;
2991  MODebug2->Error("moGsGraph::WaitForFormatDefinition > time out !!! " + IntToStr(timeout) + " ms elapsed!");
2992 }
2993 
2994 
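// BuildLiveSound: builds an audio-only playback graph for a sound file:
//   filesrc ! decodebin, whose audio pad is linked from the pad-added callback to
//   audioconvert (audioresample for .wav) ! speed ! audioconvert ! audiopanorama !
//   audioconvert ! volume ! audioconvert ! autoaudiosink
// The pipeline is left in PAUSED on success.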
2995 bool moGsGraph::BuildLiveSound( moText filename ) {
2996 
2997  bool link_result = false;
2998 // gchar* checkval;
2999  bool res = false;
3000 
3001  MODebug2->Push( moText("Building live sound:") + (moText)filename);
3002 
3003  moFile SoundFile( filename );
3004 
3005  if ( !SoundFile.Exists() ) return false;
3006 
3007  if (filename.Length()>0)
3008  {
3009 
3010  moText extension = filename;
3011  extension.Right(4);
3012 
3013  m_pFileSource = gst_element_factory_make ("filesrc", "source");
3014 
3015  if (m_pFileSource) {
3016 
3017  g_object_set (G_OBJECT (m_pFileSource), "location", (char*)filename/*("///home/fabri/jp5.avi")*/, NULL);
3018 
3019  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pFileSource );
3020 
3021  m_pAudioConverter = NULL;
3022 /*
3023 
3024 
3025  if (m_pAudioConverter) {
3026  res = gst_bin_add (GST_BIN (m_pGstPipeline), m_pAudioConverter );
3027  }
3028 */
3029  if (extension==moText(".wav")) {
3030  m_pAudioConverter = gst_element_factory_make ("audioresample", "resample");
3031  // MODebug2->Push( "moGsGraph:: wav file" );
3032  } else {
3033  m_pAudioConverter = gst_element_factory_make ("audioconvert", "converter");
3034  }
3035 
3036  if (m_pAudioConverter) {
3037  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioConverter );
3038  }
3039 
3040  m_pAudioSink = gst_element_factory_make ("autoaudiosink", "audioout");
3041 
3042  if (m_pAudioSink) {
3043  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioSink );
3044  }
3045 
3050  m_pAudioSpeed = gst_element_factory_make ("speed", "speed");
3051 
3052  if (m_pAudioSpeed) {
3053  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioSpeed );
3054  }
3055 
3056  m_pAudioVolume = gst_element_factory_make ("volume", "volume");
3057 
3058  if (m_pAudioVolume) {
3059  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioVolume );
3060  }
3061 
3062  m_pAudioPanorama = gst_element_factory_make ("audiopanorama", "audiopanorama");
3063 
3064  if (m_pAudioPanorama) {
3065  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioPanorama );
3066  }
3067 
3068  m_pAudioConverter2 = gst_element_factory_make ("audioconvert", "audioconvert2");
3069 
3070  if (m_pAudioConverter2) {
3071  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioConverter2 );
3072  }
3073 
3074  m_pAudioConverter3 = gst_element_factory_make ("audioconvert", "audioconvert3");
3075 
3076  if (m_pAudioConverter3) {
3077  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioConverter3 );
3078  }
3079 
3080 /*
3081  m_pAudioEcho = gst_element_factory_make ("audioecho", "audioecho");
3082 
3083 
3084  if (m_pAudioEcho) {
3085  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioEcho );
3086  unsigned long long max_delay,delay;
3087  max_delay = 2000000000;
3088  delay = 0;
3089  float intensity = 0.0;
3090 
3091  g_object_set ( (GstElement*)m_pAudioEcho, "max-delay", max_delay, NULL);
3092  g_object_set ( (GstElement*)m_pAudioEcho, "delay", delay, NULL);
3093  g_object_set ( (GstElement*)m_pAudioEcho, "intensity", intensity, NULL);
3094  }
3095 */
3096 
3097  m_pAudioConverter4 = gst_element_factory_make ("audioconvert", "audioconvert4");
3098 
3099  if (m_pAudioConverter4) {
3100  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioConverter4 );
3101  }
3102 
3103  m_pDecoderBin = gst_element_factory_make ( DECODEBIN, "decoder");
3104  if (m_pDecoderBin) {
3105 #ifndef GSTVERSION
3106  signal_newpad_id = g_signal_connect ((GstElement*)m_pDecoderBin, "new-decoded-pad", G_CALLBACK (cb_newpad), (gpointer)this);
3107 #else
3108  signal_newpad_id = g_signal_connect ((GstElement*)m_pDecoderBin, "pad-added", G_CALLBACK (cb_pad_added_new), (gpointer)this);
3109 #endif
3110  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pDecoderBin );
3111  }
3112 
3113 
3114  //signal_id = g_signal_connect (m_pWavParser, "new-decoded-pad", G_CALLBACK (cb_newpad), (gpointer)this);
3115  link_result = gst_element_link_many( (GstElement*)m_pFileSource, (GstElement*)m_pDecoderBin, NULL );
3116 
3117  if (link_result) {
3118  /*
3119  if (m_pAudioConverter) link_result = gst_element_link_many(
3120  (GstElement*)m_pAudioConverter,
3121  (GstElement*)m_pAudioSpeed,
3122  (GstElement*)m_pAudioConverter2,
3123  (GstElement*)m_pAudioPanorama,
3124  (GstElement*)m_pAudioConverter3,
3125  (GstElement*)m_pAudioEcho,
3126  (GstElement*)m_pAudioConverter4,
3127  (GstElement*)m_pAudioVolume,
3128  (GstElement*)m_pAudioSink,
3129  NULL
3130  );
3131  */
3132  if (m_pAudioConverter) link_result = gst_element_link_many(
3133  (GstElement*)m_pAudioConverter,
3134  (GstElement*)m_pAudioSpeed,
3135  (GstElement*)m_pAudioConverter2,
3136  (GstElement*)m_pAudioPanorama,
3137  (GstElement*)m_pAudioConverter3,
3138  (GstElement*)m_pAudioVolume,
3139  (GstElement*)m_pAudioConverter4,
3140  (GstElement*)m_pAudioSink,
3141  NULL
3142  );
3143  //else link_result = gst_element_link_many( (GstElement*)m_pAudioSink, NULL );
3144 
3145  if (link_result) {
3146 
3147  CheckState( gst_element_set_state ((GstElement*)m_pGstPipeline, GST_STATE_PAUSED), true /*SYNCRUNASLI*/ );
3148 
3149  //WaitForFormatDefinition( 1600 );
3150 
3151  cout << "state gstreamer finish" << endl;
3152 
3153  return true;
3154 
3155  } else {
3156  MODebug2->Error(moText("moGsGraph::error: audio filter chain (m_pAudioConverter .. m_pAudioSink) linking failed"));
3157  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3158  }
3159  } else {
3160  MODebug2->Error(moText("moGsGraph::error: m_pFileSource m_pDecoderBin linking failed"));
3161  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3162  }
3163 
3164  }
3165 
3166 
3167 
3168 
3169  }
3170 
3171  return false;
3172 }
3173 
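// BuildAudioFilters: adds the audio branch (audioconvert ! volume ! audiopanorama !
// autoaudiosink) to the existing pipeline and links it, so that a decodebin audio pad
// can later be attached to it from the pad-added callback.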
3174 
3175 void moGsGraph::BuildAudioFilters() {
3176 
3177  //BuildLock.Lock();
3178  bool res = false;
3179 
3180  if (m_pGstPipeline) {
3181  m_pAudioConverter = gst_element_factory_make ("audioconvert", "convert");
3182 
3183  if (m_pAudioConverter) {
3184  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioConverter );
3185  }
3186 
3187  m_pAudioVolume = gst_element_factory_make ("volume", "volume");
3188 
3189  if (m_pAudioVolume) {
3190  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioVolume );
3191  }
3192 
3193  m_pAudioPanorama = gst_element_factory_make ("audiopanorama", "balance");
3194 
3195  if (m_pAudioPanorama) {
3196  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioPanorama );
3197  }
3198 
3199  m_pAudioSink = gst_element_factory_make ("autoaudiosink", "audioout");
3200 
3201  if (m_pAudioSink) {
3202  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pAudioSink );
3203  }
3204 
3205  bool link_audio_result = gst_element_link_many( (GstElement*)m_pAudioConverter, (GstElement*)m_pAudioVolume, (GstElement*)m_pAudioPanorama, (GstElement*)m_pAudioSink, NULL );
3206  MODebug2->Message("BuildAudioFilters: link_audio_result: "+IntToStr(int(link_audio_result)) );
3207  }
3208  //BuildLock.Unlock();
3209 
3210 }
3211 
3212 
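// BuildLiveVideoGraph: builds the file playback pipeline: filesrc ! decodebin, whose
// video pad is linked from the pad-added callback to videoconvert [! videobalance] !
// videoconvert (! RGB capsfilter on 0.10) ! appsink/fakesink. The graph is prerolled in
// PAUSED and the negotiated format is read from the first appsink sample (1.x) or via
// WaitForFormatDefinition (0.10).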
3213 bool moGsGraph::BuildLiveVideoGraph( moText filename , moBucketsPool *pBucketsPool ) {
3214 
3215  m_pBucketsPool = pBucketsPool;
3216  bool link_result = false;
3217 // gchar* checkval;
3218  bool res = false;
3219 
3220  moFile VideoFile( filename );
3221 
3222  if ( !VideoFile.Exists() ) return false;
3223 
3224  //if (filename.Length()>0)
3225  {
3226 
3227  m_pFileSource = gst_element_factory_make ("filesrc", "source");
3228 
3229  if (m_pFileSource) {
3230  g_object_set (G_OBJECT (m_pFileSource), "location", (char*)filename/*("///home/fabri/jp5.avi")*/, NULL);
3231  //g_object_get (G_OBJECT (m_pFileSource), "location", &checkval, NULL);
3232  //GstElement *filter = gst_element_factory_make ("capsfilter", "filter");
3233  //g_object_set (G_OBJECT (m_pFileSource), "pattern", GST_VIDEO_TEST_SRC_SNOW, NULL);
3234  //res = gst_pad_set_caps( gst_element_get_pad( m_pFileSource, "src" ), NULL);
3235 
3236 
3237  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pFileSource );
3238 
3239 
3240  m_pColorSpaceInterlace = gst_element_factory_make (VIDEOCONVERT, "color0");
3241  if (m_pColorSpaceInterlace) {
3242  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pColorSpaceInterlace );
3243  }
3244 
3245  m_pVideoBalance = gst_element_factory_make ("videobalance", "videobalance");
3246  if (m_pVideoBalance) {
3247  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pVideoBalance );
3248  }
3249 
3250  m_pColorSpace = gst_element_factory_make (VIDEOCONVERT, "color");
3251  if (m_pColorSpace) {
3252  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pColorSpace );
3253  }
3254 /*
3255  m_pCapsFilter = gst_element_factory_make ("capsfilter", "filt");
3256  if (m_pCapsFilter) {
3257 #ifndef GSTVERSION
3258 
3259  g_object_set (G_OBJECT ((GstElement*)m_pCapsFilter), "caps", gst_caps_new_simple ("video/x-raw-rgb",
3260  "bpp", G_TYPE_INT, 24,
3261  "depth", G_TYPE_INT, 24,
3262  "red_mask",G_TYPE_INT, 255,
3263  "green_mask",G_TYPE_INT, 65280,
3264  "blue_mask",G_TYPE_INT, 16711680,
3265  "endianness", G_TYPE_INT, 4321,
3266  NULL), NULL);
3267  //depth=(int)24, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, endianness=(int)4321
3268  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pCapsFilter );
3269 
3270 #else
3271  g_object_set (G_OBJECT (m_pCapsFilter), "caps", gst_caps_new_simple ( "video/x-raw",
3272  "format", G_TYPE_STRING, "RGB",
3273  NULL), NULL);
3274  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pCapsFilter );
3275 #endif
3276 
3277  }*/
3278  //RetreivePads( m_pFileSource );
3279 
3281  //BuildAudioFilters();
3282 
3284 
3285  m_pDecoderBin = gst_element_factory_make ( DECODEBIN, "decoder");
3286  if (m_pDecoderBin) {
3287  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pDecoderBin );
3288 
3289  //m_pFakeSink = gst_element_factory_make ("fakesink", "destout"); // redundant: m_pFakeSink is created again below
3290 #ifndef GSTVERSION
3291  //signal_newpad_id = g_signal_connect (m_pDecoderBin, "pad-added", G_CALLBACK (cb_pad_added), (gpointer)this);
3292  signal_newpad_id = g_signal_connect (m_pDecoderBin, "new-decoded-pad", G_CALLBACK (cb_newpad), (gpointer)this);
3293 #else
3294  signal_newpad_id = g_signal_connect (m_pDecoderBin, "pad-added", G_CALLBACK (cb_pad_added_new), (gpointer)this);
3295  MODebug2->Message( moText("moGsGraph:: added signal to Decoder Bin, \"pad-added\": ") + IntToStr(signal_newpad_id) ) ;
3296 #endif
3297 
3298 #ifndef GSTVERSION
3299  m_pFakeSink = gst_element_factory_make ("fakesink", "destout");
3300 #else
3301  cout << "creating FakeSink from appsink" << endl;
3302  m_pFakeSink = gst_element_factory_make ("appsink", "destout");
3303 #endif
3304  //RetreivePads( m_pFakeSink );
3305  if (m_pFakeSink) {
3306  MODebug2->Message(moText("moGsGraph:: created FakeSink! ") ) ;
3307 #ifdef GSTVERSION
3308  g_object_set (G_OBJECT (m_pFakeSink), "caps", gst_caps_new_simple ( "video/x-raw",
3309  "format", G_TYPE_STRING, "RGB",
3310  NULL), NULL);
3311  g_object_set (G_OBJECT (m_pFakeSink), "sync", (bool)true, NULL);
3312  g_object_set (G_OBJECT (m_pFakeSink), "drop", true, NULL);
3313  //gst_app_sink_set_emit_signals( (GstAppSink*)m_pFakeSink, true);
3314  gst_app_sink_set_max_buffers( (GstAppSink*)m_pFakeSink, 100 );
3315 #else
3316  g_object_set (G_OBJECT (m_pFakeSink), "sync", (bool)true, NULL);
3318 
3319 #endif
3320  res = gst_bin_add (GST_BIN ((GstElement*)m_pGstPipeline), (GstElement*)m_pFakeSink );
3321 
3322  link_result = gst_element_link_many( (GstElement*)m_pFileSource, (GstElement*)m_pDecoderBin, NULL );
3323  if (link_result) {
3324 #ifndef GSTVERSION
3325  if (m_pVideoBalance)
3326  link_result = gst_element_link_many( (GstElement*)m_pColorSpaceInterlace, (GstElement*)m_pVideoBalance, (GstElement*)m_pColorSpace, (GstElement*)m_pCapsFilter, (GstElement*)m_pFakeSink, NULL );
3327  else
3328  link_result = gst_element_link_many( (GstElement*)m_pColorSpaceInterlace, (GstElement*)m_pColorSpace, (GstElement*)m_pCapsFilter, (GstElement*)m_pFakeSink, NULL );
3329 #else
3330  if (m_pVideoBalance)
3331  link_result = gst_element_link_many( (GstElement*)m_pColorSpaceInterlace, (GstElement*)m_pVideoBalance, (GstElement*)m_pColorSpace, (GstElement*)m_pFakeSink, NULL );
3332  else
3333  link_result = gst_element_link_many( (GstElement*)m_pColorSpaceInterlace, (GstElement*)m_pColorSpace, (GstElement*)m_pFakeSink, NULL );
3334 #endif
3335 
3337  //if (m_pAudioConverter)
3338  // bool link_audio_result = gst_element_link_many( (GstElement*)m_pAudioConverter, (GstElement*)m_pAudioVolume, (GstElement*)m_pAudioPanorama, (GstElement*)m_pAudioSink, NULL );
3339 
3340  if (link_result) {
3341 
3342  CheckState( gst_element_set_state ((GstElement*)m_pGstPipeline, GST_STATE_PAUSED), true /*SYNCRUNASLI*/ );
3343  MODebug2->Message( moText("moGsGraph::BuildLiveVideoGraph > GST_STATE_PAUSED > OK"));
3344  //CheckState( gst_element_set_state ((GstElement*)m_pGstPipeline, GST_STATE_NULL), true /*SYNCRUNASLI*/ );
3345  //MODebug2->Message( moText("moGsGraph::BuildLiveVideoGraph > GST_STATE_NULL > OK"));
3349 #ifdef GSTVERSION
3350  GstSample *sample;
3351  MODebug2->Message( moText("moGsGraph::BuildLiveVideoGraph > gst_app_sink_pull_preroll for appsink"));
3352  //g_signal_emit_by_name ( m_pFakeSink, "pull-sample", &sample, NULL);
3353 
3354  sample = gst_app_sink_pull_preroll( (GstAppSink*) m_pFakeSink );
3355  if (sample) {
3356  GstBuffer *Gbuffer;
3357  GstCaps *bcaps;
3358  GstStructure *bstr;
3359 
3364  bcaps = gst_sample_get_caps( sample );
3365  if (bcaps) {
3366  Gbuffer = gst_sample_get_buffer (sample);
3367  SetVideoFormat( bcaps, Gbuffer );
3368  gst_app_sink_set_emit_signals((GstAppSink*)m_pFakeSink, true);
3369  gst_app_sink_set_drop((GstAppSink*)m_pFakeSink, true);
3370  gst_app_sink_set_wait_on_eos ((GstAppSink*)m_pFakeSink, false);
3371  //g_object_set (G_OBJECT (m_pFakeSink), "sync", false, NULL);
3372  gst_app_sink_set_max_buffers((GstAppSink*)m_pFakeSink, 100 );
3373  #ifndef USING_SYNC_FRAMEBUFFER
3374  g_signal_connect( (GstElement*)m_pFakeSink, "new-sample", G_CALLBACK (appsink_new_sample), (gpointer)this );
3375  #endif
3376  //g_signal_connect( (GstElement*)m_pFakeSink, "new-sample", G_CALLBACK (appsink_new_sample), (gpointer)this );
3377  //gst_app_sink_set_callbacks( (GstAppSink*)m_pFakeSink, )
3378  //g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
3379 
3380  }
3381  } else {
3382  MODebug2->Error( moText("moGsGraph::BuildLiveVideoGraph > no sample!"));
3383  cout << "gst_app_sink_is_eos: " << gst_app_sink_is_eos((GstAppSink*)m_pFakeSink) << endl;
3384  cout << "gst_app_sink_get_emit_signals: " << gst_app_sink_get_emit_signals((GstAppSink*)m_pFakeSink) << endl;
3385  cout << "gst_app_sink_get_max_buffers: " << gst_app_sink_get_max_buffers((GstAppSink*)m_pFakeSink) << endl;
3386  }
3387 #else
3388  WaitForFormatDefinition( 3000 );
3389 #endif
3390 
3391  MODebug2->Message( moText("moGsGraph::BuildLiveVideoGraph > graph built"));
3392 
3393  //event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3394 
3395  return true;
3396 
3397  } else {
3398  MODebug2->Error( moText("moGsGraph::BuildLiveVideoGraph > m_pColorSpace m_pCapsFilter m_pFakeSink linking failed"));
3399  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3400  }
3401  } else {
3402  MODebug2->Error( moText("moGsGraph::BuildLiveVideoGraph > filesrc and decodebin linkage failed: ") + filename );
3403  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3404  }
3405 
3406  } else {
3407  MODebug2->Error( moText("moGsGraph::BuildLiveVideoGraph > fakesink construction failed"));
3408  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3409  }
3410  } else {
3411  MODebug2->Error( moText("moGsGraph::BuildLiveVideoGraph > decodebin construction failed"));
3412  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3413  }
3414  } else {
3415  MODebug2->Error( moText("moGsGraph::BuildLiveVideoGraph > file source failed: ") + filename);
3416  event_loop( (GstElement*)m_pGstPipeline, false, GST_STATE_PAUSED);
3417  }
3418  return false;
3419 
3420  /*
3421  GstPad *pad;
3422  pad = gst_element_get_pad (m_pDecoderBin, "src0");
3423  gst_pad_add_buffer_probe (pad, G_CALLBACK (cb_have_data), NULL);
3424  gst_object_unref (pad);
3425  */
3426 
3427 
3428  }
3429 
3430  /*SETTING SOURCE*/
3431  /*
3432  // set the source audio file
3433  g_object_set (player, "location", "helloworld.ogg", NULL);
3434  */
3435 
3436  /*more complex*/
3437  /*
3438  // create elements
3439  pipeline = gst_pipeline_new ("my_pipeline");
3440  source = gst_element_factory_make ("filesrc", "source");
3441  g_object_set (source, "location", argv[1], NULL);
3442  demux = gst_element_factory_make ("oggdemux", "demuxer");
3443 
3444  // you would normally check that the elements were created properly
3445 
3446  // put together a pipeline
3447  gst_bin_add_many (GST_BIN (pipeline), source, demux, NULL);
3448  gst_element_link_pads (source, "src", demux, "sink");
3449 
3450  // listen for newly created pads
3451  g_signal_connect (demux, "pad-added", G_CALLBACK (cb_new_pad), NULL);
3452 */
3453 
3454 /*
3455  // create elements
3456  source = gst_element_factory_make ("fakesrc", "source");
3457  filter = gst_element_factory_make ("identity", "filter");
3458  sink = gst_element_factory_make ("fakesink", "sink");
3459 
3460  // must add elements to pipeline before linking them
3461  gst_bin_add_many (GST_BIN (pipeline), source, filter, sink, NULL);
3462 
3463  // link
3464  if (!gst_element_link_many (source, filter, sink, NULL)) {
3465  g_warning ("Failed to link elements!");
3466  }
3467 
3468 */
3469 
3470 
3471 /*putting pipelines into pipelines!!!! */
3472 /*
3473  //
3474  pipeline = gst_pipeline_new ("my_pipeline");
3475  bin = gst_pipeline_new ("my_bin");
3476  source = gst_element_factory_make ("fakesrc", "source");
3477  sink = gst_element_factory_make ("fakesink", "sink");
3478 
3479  // set up pipeline
3480  gst_bin_add_many (GST_BIN (bin), source, sink, NULL);
3481  gst_bin_add (GST_BIN (pipeline), bin);
3482  gst_element_link (source, sink);
3483  */
3484 
3485 
3486 
3487  /*getting specific properties of caps*/
3488 /*
3489  static void
3490 read_video_props (GstCaps *caps)
3491 {
3492  gint width, height;
3493  const GstStructure *str;
3494 
3495  g_return_if_fail (gst_caps_is_fixed (caps));
3496 
3497  str = gst_caps_get_structure (caps, 0);
3498  if (!gst_structure_get_int (str, "width", &width) ||
3499  !gst_structure_get_int (str, "height", &height)) {
3500  g_print ("No width/height available\n");
3501  return;
3502  }
3503 
3504  g_print ("The video size of this set of capabilities is %dx%d\n",
3505  width, height);
3506 }
3507 */
3508 
3509 /*CREATE GHOST PAD FOR A BIN ATTACHED TO THE FIRST IN-PAD*/
3510 /*
3511  GstElement *bin, *sink;
3512  GstPad *pad;
3513 
3514  // init
3515  gst_init (&argc, &argv);
3516 
3517  // create element, add to bin
3518  sink = gst_element_factory_make ("fakesink", "sink");
3519  bin = gst_bin_new ("mybin");
3520  gst_bin_add (GST_BIN (bin), sink);
3521 
3522  //add ghostpad
3523  pad = gst_element_get_pad (sink, "sink");
3524  gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
3525  gst_object_unref (GST_OBJECT (pad));
3526 
3527 */
3528  return false;
3529 
3530 }
3531 
3532 
3533 /*
3534 typical GStreamer caps fields:
3535 format(fourcc);
3536 bpp=(int)32,
3537 depth=(int)24,
3538 endianness=(int)4321,
3539 red_mask=(int)-16777216,
3540 green_mask=(int)16711680,
3541 blue_mask=(int)65280,
3542 width=(int)[ 2, 2147483647 ],
3543 height=(int)[ 2, 2147483647 ],
3544 framerate=(fraction)[ 0/1, 2147483647/1 ];
3545 */
3546 
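// SetAudioFormat: when the caps are fixed and describe audio (a "channels" field is
// present), reads width, depth, channels and rate into m_AudioFormat and takes the
// buffer duration and size as time-per-sample and buffer size.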
3547 void
3548 moGsGraph::SetAudioFormat( moGstCaps* caps, moGstBuffer* buffer ) {
3549 
3550  bool isfixed = false;
3551  GstBuffer* Gbuffer = (GstBuffer*)buffer;
3552 
3553  isfixed = gst_caps_is_fixed((GstCaps*)caps);
3554 
3555 
3556  if (!isfixed) {
3557 
3558  return;
3559  }
3560 
3561  GstStructure* str;
3562  str = gst_caps_get_structure ((GstCaps*)caps, 0);
3563 
3564  const gchar *sstr;
3565 
3566  sstr = gst_structure_to_string (str);
3567 
3568  //cout << "SetVideoFormat: we have a format!!" << sstr << endl;
3569 
3570  if (g_strrstr( sstr, "channels" )) {
3571 
3572  //to calculate framerate
3573  gint width, depth;
3574  //gint value_numerator, value_denominator;
3575  gint channels, rate;
3576 
3577  gst_structure_get_int( str, "width", &width);
3578  gst_structure_get_int( str, "depth", &depth);
3579  gst_structure_get_int( str, "channels", &channels);
3580  gst_structure_get_int( str, "rate", &rate);
3581  //gst_structure_get_int( str, "height", &height);
3582  //gst_structure_get_fraction( str, "framerate", &value_numerator, &value_denominator );
3583 
3584  m_AudioFormat.m_Width = (MOuint)width;
3585  m_AudioFormat.m_Depth = (MOuint)depth;
3586  m_AudioFormat.m_Channels = (MOuint)channels;
3589 /*
3590  m_AudioFormat.m_Width = (MOuint)width;
3591  m_AudioFormat.m_Height = (MOuint)height;
3592  m_AudioFormat.m_FrameRate = (value_numerator * 100) / value_denominator;
3593  //cout << "Width:" << m_AudioFormat.m_Width << endl;
3594  //cout << "Height:" << m_AudioFormat.m_Height << endl;
3595  //cout << "Framerate:" << m_AudioFormat.m_FrameRate << endl;
3596 
3597  //m_AudioFormat.m_BitCount = pVih->bmiHeader.biBitCount;
3598  //m_AudioFormat.m_BitRate = pVih->dwBitRate;
3599  */
3600  if (Gbuffer!=NULL) {
3601  m_AudioFormat.m_TimePerSample = Gbuffer->duration;
3602 #ifndef GSTVERSION
3603  m_AudioFormat.m_BufferSize = Gbuffer->size;
3604 #else
3605  m_AudioFormat.m_BufferSize = gst_buffer_get_size( Gbuffer );
3606 #endif
3607  }
3608  //m_AudioFormat.SetVideoMode();
3610 
3611  }
3612 
3613  MODebug2->Message(
3614  "SetAudioFormat: we have a format!! "
3616  + " Channels, "
3618  + " Hz, "
3620  + " bits, "
3622  + " bytes per buffer, "
3624  + " nanoseconds per sample "
3625 
3626  );
3627 
3628 
3629 }
3630 
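// SetVideoFormat: when the caps are fixed and describe video (a "width" field is
// present), reads width, height, framerate, colour masks and bpp into m_VideoFormat and
// takes the buffer duration and size as time-per-frame and buffer size.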
3631 void
3632 moGsGraph::SetVideoFormat( moGstCaps* caps, moGstBuffer* buffer ) {
3633 
3634  bool isfixed = false;
3635  GstBuffer* Gbuffer = (GstBuffer*)buffer;
3636 
3637  isfixed = gst_caps_is_fixed((GstCaps*)caps);
3638 
3639 
3640  if (!isfixed) {
3641 
3642  return;
3643  }
3644 
3645  GstStructure* str;
3646  str = gst_caps_get_structure ((GstCaps*)caps, 0);
3647 
3648  const gchar *sstr;
3649 
3650  sstr = gst_structure_to_string (str);
3651 
3652  //cout << "SetVideoFormat: we have a format!!" << sstr << endl;
3653 
3654  if (g_strrstr( sstr, "width" )) {
3655 
3656  //to calculate framerate
3657  gint width, height, value_numerator, value_denominator, redmask, greenmask, bluemask, bitcount;
3658 
3659  gst_structure_get_int( str, "width", &width);
3660  gst_structure_get_int( str, "height", &height);
3661  gst_structure_get_fraction( str, "framerate", &value_numerator, &value_denominator );
3662  gst_structure_get_int( str, "red_mask", &redmask );
3663  gst_structure_get_int( str, "green_mask", &greenmask );
3664  gst_structure_get_int( str, "blue_mask", &bluemask );
3665  gst_structure_get_int( str, "bpp", &bitcount );
3666 
3667  m_VideoFormat.m_Width = (MOuint)width;
3668  m_VideoFormat.m_Height = (MOuint)height;
3669  m_VideoFormat.m_FrameRate = (value_numerator * 100) / value_denominator;
3670  m_VideoFormat.m_RedMask = (MOuint) redmask;
3671  m_VideoFormat.m_GreenMask = (MOuint) greenmask;
3672  m_VideoFormat.m_BlueMask = (MOuint) bluemask;
3673  m_VideoFormat.m_BitCount = (MOuint) bitcount;
3674 
3675  //cout << "Width:" << m_VideoFormat.m_Width << endl;
3676  //cout << "Height:" << m_VideoFormat.m_Height << endl;
3677  //cout << "Framerate:" << m_VideoFormat.m_FrameRate << endl;
3678 
3679  //m_VideoFormat.m_BitCount = pVih->bmiHeader.biBitCount;
3680  //m_VideoFormat.m_BitRate = pVih->dwBitRate;
3681  if (buffer!=NULL) {
3682  m_VideoFormat.m_TimePerFrame = Gbuffer->duration;
3683 #ifndef GSTVERSION
3684  m_VideoFormat.m_BufferSize = Gbuffer->size;
3685 #else
3686  m_VideoFormat.m_BufferSize = gst_buffer_get_size( Gbuffer );
3687 #endif
3688  }
3691 
3695 
3696  }
3697 
3698  MODebug2->Message(
3699  "SetVideoFormat: we have a format!!"
3700  + IntToStr( m_VideoFormat.m_Width )
3701  + " X "
3702  + IntToStr( m_VideoFormat.m_Height )
3703  + " m_BitCount: "
3704  + IntToStr( m_VideoFormat.m_BitCount )
3705  + " m_BufferSize: "
3706  + IntToStr( m_VideoFormat.m_BufferSize )
3707  + " buffer duration: "
3708  + IntToStr( m_VideoFormat.m_TimePerFrame )
3709  + " m_FrameRate: "
3710  + IntToStr( m_VideoFormat.m_FrameRate )
3711  + " m_RedMask: "
3712  + IntToStr( m_VideoFormat.m_RedMask )
3713  + " m_GreenMask: "
3714  + IntToStr( m_VideoFormat.m_GreenMask )
3715  + " m_BlueMask: "
3716  + IntToStr( m_VideoFormat.m_BlueMask )
3717 
3718  );
3719 
3720 
3721 }
3722 
3723 
3724 
3725 
3726 
3727 
3728 /*
3729  *
3730 
3731  GST_STATE_NULL: this is the default state. This state will deallocate all resources held by the element.
3732  *
3733 
3734  GST_STATE_READY: in the ready state, an element has allocated all of its global resources, that is, resources that can be kept
3735  within streams. You can think about opening devices, allocating buffers and so on. However,
3736  the stream is not opened in this state, so the stream position is automatically zero.
3737  If a stream was previously opened, it should be closed in this state, and position, properties and such should be reset.
3738  *
3739 
3740  GST_STATE_PAUSED: in this state, an element has opened the stream, but is not actively processing it. An element is allowed to
3741  modify a stream's position, read and process data and such to prepare for playback as soon as state is changed to PLAYING,
3742  but it is not allowed to play the data which would make the clock run. In summary, PAUSED is the same as PLAYING but without
3743  a running clock.
3744 
3745  Elements going into the PAUSED state should prepare themselves for moving over to the PLAYING state as soon as possible.
3746  Video or audio outputs would, for example, wait for data to arrive and queue it so they can play it right after the state change.
3747  Also, video sinks can already play the first frame (since this does not affect the clock yet). Autopluggers could use this same
3748  state transition to already plug together a pipeline. Most other elements, such as codecs or filters, do not need to explicitly
3749  do anything in this state, however.
3750  *
3751 
3752  GST_STATE_PLAYING: in the PLAYING state, an element does exactly the same as in the PAUSED state, except that the clock now runs.
3753 
3754 */
3755 
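To make the PAUSED/PLAYING distinction above concrete, here is a minimal, illustrative sketch (not code from this file) of driving those transitions and waiting for an asynchronous change to settle; the one-second timeout is a placeholder choice:

#include <gst/gst.h>

/* Minimal sketch (not part of moGsGraph): drive a pipeline to PAUSED, wait
   for an asynchronous state change to complete, then start playback. */
static gboolean preroll_and_play (GstElement *pipeline)
{
  GstStateChangeReturn ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);
  if (ret == GST_STATE_CHANGE_ASYNC) {
    GstState current, pending;
    /* block for up to one second while the pipeline prerolls */
    ret = gst_element_get_state (pipeline, &current, &pending, GST_SECOND);
  }
  if (ret == GST_STATE_CHANGE_FAILURE)
    return FALSE;
  /* PAUSED reached: the clock is still stopped; PLAYING starts it */
  return gst_element_set_state (pipeline, GST_STATE_PLAYING) != GST_STATE_CHANGE_FAILURE;
}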
3756 
3757 
3758 /*
3759 typedef enum {
3760  GST_STATE_CHANGE_FAILURE = 0,
3761  GST_STATE_CHANGE_SUCCESS = 1,
3762  GST_STATE_CHANGE_ASYNC = 2,
3763  GST_STATE_CHANGE_NO_PREROLL = 3
3764 } GstStateChangeReturn;
3765 */
3766 
3767 bool
3768 moGsGraph::CheckState( moGstStateChangeReturn state_change_result, bool waitforsync) {
3769 
3770 
3771  GstStateChangeReturn Gstate_change_result = (GstStateChangeReturn)state_change_result;
3772 
3773  if (!waitforsync)
3774  switch(Gstate_change_result) {
3775  case GST_STATE_CHANGE_FAILURE:
3776  //MODebug2->Push(moText("GST_STATE_CHANGE_FAILURE"));
3777  return false;
3778  break;
3779  case GST_STATE_CHANGE_SUCCESS:
3780  //MODebug2->Push(moText("GST_STATE_CHANGE_SUCCESS"));
3781  return true;
3782  break;
3783  case GST_STATE_CHANGE_ASYNC:
3784  //MODebug2->Push(moText("GST_STATE_CHANGE_ASYNC"));
3785  return true;
3786  break;
3787  case GST_STATE_CHANGE_NO_PREROLL:
3788  //MODebug2->Push(moText("GST_STATE_CHANGE_NO_PREROLL"));
3789  return false;
3790  break;
3791  }
3792 
3793  GstStateChangeReturn state_wait;
3794  GstState current_state, pending_state;
3795  GstClockTime time_out = GST_CLOCK_TIME_NONE;
3796  time_out = GST_SECOND;
3797 
3798  while(waitforsync) {
3799  MODebug2->Message("while wait for sync");
3800  state_wait = gst_element_get_state(GST_ELEMENT (m_pGstPipeline),&current_state, &pending_state, time_out);
3801  MODebug2->Message("state_wait result: " + IntToStr( state_wait ) );
3802  switch(state_wait) {
3803  case GST_STATE_CHANGE_SUCCESS:
3804  waitforsync = false;
3805  MODebug2->Message("change success");
3806  return true;
3807  break;
3808  case GST_STATE_CHANGE_FAILURE:
3809  waitforsync = false;
3810  MODebug2->Message("change failure!");
3811  return false;
3812  break;
3813  case GST_STATE_CHANGE_ASYNC:
3814  waitforsync = true;
3815  MODebug2->Message("change async!");
3816  break;
3817  default:
3818  waitforsync = true;
3819  MODebug2->Message("waitforsync");
3820  break;
3821  /*
3822  case GST_STATE_CHANGE_NO_PREROLL:
3823  waitforsync = true;
3824  break;
3825  */
3826  }
3827  }
3828 
3829  return false;
3830 
3831 }
3832 
3833 moStreamState moGsGraph::GetState() {
3834 
3835  GstStateChangeReturn state_wait;
3836  GstState current_state, pending_state;
3837  GstClockTime time_out = GST_CLOCK_TIME_NONE;
3838  time_out = GST_SECOND;
3839 
3840  GstPad* srcRGB = NULL;
3841  bool padactive = false;
3842  bool padlinked = false;
3843  bool padblocked = false;
3844  bool padblocking = false;
3845 
3846 
3847  if (m_pColorSpace) {
3848 #ifndef GSTVERSION
3849  srcRGB = gst_element_get_pad ( (GstElement*)m_pColorSpace, "src");
3850 #else
3851  srcRGB = gst_element_get_static_pad ( (GstElement*)m_pColorSpace, "src" );
3852 #endif
3853 
3854  padactive = gst_pad_is_active( srcRGB );
3855  padlinked = gst_pad_is_linked( srcRGB );
3856  padblocked = gst_pad_is_blocked( srcRGB );
3857  padblocking = gst_pad_is_blocking( srcRGB );
3858  }
3859 
3860  if (m_pGMainContext) {
3861  if (g_main_context_iteration( (GMainContext*)m_pGMainContext, false )) {
3862  //MODebug2->Message( moText("moGsGraph ::GetState (events)") );
3863  } else {
3864  //MODebug2->Message( moText("moGsGraph ::GetState (no events!!)"));
3865  }
3866  }
3867 /*
3868  MODebug2->Message( moText(" Position:")
3869  + IntToStr( this->GetPosition())
3870  //+ moText(" pad active: ")
3871  // + IntToStr((int)padactive)
3872  // + moText(" pad linked: ")
3873  // + IntToStr((int)padlinked)
3874  // + moText(" pad blocked: ")
3875  // + IntToStr((int)padblocked)
3876  // + moText(" pad blocking: ")
3877  // + IntToStr((int)padblocking)
3878  );
3879 */
3880  //MODebug2->Message( moText("moGsGraph ::GetState > gst_element_get_state"));
3881  state_wait = gst_element_get_state(GST_ELEMENT (m_pGstPipeline),&current_state, &pending_state, time_out);
3882  /*g_main_context_iteration
3883  GST_STATE_VOID_PENDING = 0,
3884  GST_STATE_NULL = 1,
3885  GST_STATE_READY = 2,
3886  GST_STATE_PAUSED = 3,
3887  GST_STATE_PLAYING = 4
3888  */
3889 
3890  switch(current_state) {
3891  case GST_STATE_VOID_PENDING:
3892  //MODebug2->Message( moText("moGsGraph ::GetState GST_STATE_VOID_PENDING"));
3893  return MO_STREAMSTATE_UNKNOWN;
3894  break;
3895  case GST_STATE_NULL:
3896  //MODebug2->Message( moText("moGsGraph ::GetState GST_STATE_NULL"));
3897  return MO_STREAMSTATE_STOPPED;
3898  break;
3899  case GST_STATE_READY:
3900  //MODebug2->Message( moText("moGsGraph ::GetState GST_STATE_READY"));
3901  return MO_STREAMSTATE_READY;
3902  break;
3903  case GST_STATE_PAUSED:
3904  //MODebug2->Message( moText("moGsGraph ::GetState GST_STATE_PAUSED"));
3905  return MO_STREAMSTATE_PAUSED;
3906  break;
3907  case GST_STATE_PLAYING:
3908  //MODebug2->Message( moText("moGsGraph ::GetState GST_STATE_PLAYING"));
3909  return MO_STREAMSTATE_PLAYING;
3910  break;
3911  }
3912 
3913  //MODebug2->Message( moText("moGsGraph ::GetState MO_STREAMSTATE_UNKNOWN"));
3914 
3915  return MO_STREAMSTATE_UNKNOWN;
3916 
3917 }
3918 
3919 //CONTROL METHODS
3920 void
3921 moGsGraph::Play() {
3922  /* start the pipeline */
3923  //MODebug2->Message(moText("moGsGraph::Play()"));
3924  //MODebug2->Message(moText("moGsGraph::Play( SetEOS)"));
3925  SetEOS(false);
3926  //MODebug2->Message(moText("moGsGraph::Play() calling CheckState -> GST_STATE_PLAYING"));
3927  CheckState( gst_element_set_state (GST_ELEMENT (m_pGstPipeline), GST_STATE_PLAYING), true );
3928  //MODebug2->Message(moText("moGsGraph::Play() returning CheckState."));
3929 }
3930 
3931 void
3932 moGsGraph::Stop() {
3933  /*set state to NULL*/
3934  SetEOS(false);
3935  CheckState( gst_element_set_state (GST_ELEMENT (m_pGstPipeline), GST_STATE_NULL) );
3936  //moGsGraph::Pause();
3937 }
3938 
3939 void
3940 moGsGraph::Pause() {
3941 /*set state to NULL*/
3944  MODebug2->Message( "moGsGraph::Pause() pausing" );
3945  int b = 2;
3946  while(b!=0) {
3947  b = gst_element_is_locked_state (GST_ELEMENT (m_pGstPipeline));
3948  MODebug2->Message( moText("moGsGraph::Pause() check locked ") + IntToStr(b) );
3949  }
3950  MODebug2->Message( moText("moGsGraph::Pause() set state"));
3951  GstStateChangeReturn st = gst_element_set_state (GST_ELEMENT (m_pGstPipeline), GST_STATE_PAUSED);
3952  MODebug2->Message( moText("moGsGraph::Pause() check state ") + IntToStr((int)st) );
3953  CheckState( st, true );
3954  MODebug2->Message( "moGsGraph::Pause() passed" );
3955 
3956  }
3957 }
3958 
3959 #define MO_INFINITE -1
3960 
3964 void
3965 moGsGraph::Seek( MOuint frame, float rate ) {
3966 
3967  gint64 time_nanoseconds;
3968  bool res;
3969  rate = 1.0;
3970  //MODebug2->Message(moText("moGsGraph :: Seeking:") + IntToStr(frame) );
3971 
3973 
3976  if ( (GetState()==MO_STREAMSTATE_PAUSED) && frame >= (m_FramesLength - 1) ) {
3977  frame = m_FramesLength - 1;
3978  }
3979 
3980  time_nanoseconds = (gint64) frame * m_VideoFormat.m_TimePerFrame;
3981  //MODebug2->Message(" Seeking frame: " + IntToStr(frame) + " time (ns): " + IntToStr(time_nanoseconds) + " timeperframe:" + IntToStr(m_VideoFormat.m_TimePerFrame) );
3982  //cout << "seeking frame:" << frame << " in " << time_nanoseconds << endl;
3983  /*res = gst_element_seek (m_pGstPipeline, 1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
3984  GST_SEEK_TYPE_SET, time_nanoseconds,
3985  GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);
3986  */
3987  res = gst_element_seek_simple(
3988  (GstElement*)m_pGstPipeline,
3989  GST_FORMAT_TIME,
3990  (GstSeekFlags)(
3991  GST_SEEK_FLAG_FLUSH
3992  | GST_SEEK_FLAG_KEY_UNIT
3993  //| GST_SEEK_FLAG_ACCURATE
3994  ),
3995  time_nanoseconds );
3996  //cout << "success:" << res << endl;
3997  //this->Pause();
3998  } else {
4000  time_nanoseconds = frame * GST_MSECOND;
4001  res = gst_element_seek_simple( (GstElement*)m_pGstPipeline, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT ), time_nanoseconds );
4002  if (res!=true) {
4003  MODebug2->Error("moGsGraph :: Seek (time) error");
4004  }
4008  }
4009 
4010 }
4011 
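Seek() converts a frame index into a GST_FORMAT_TIME target by multiplying it by the per-frame duration taken from the video format. As an illustrative sketch only (not code from this file), the same arithmetic can be derived from a framerate fraction; the 25/1 framerate is a placeholder value:

#include <gst/gst.h>

/* Minimal sketch (not part of moGsGraph): derive nanoseconds-per-frame from a
   framerate fraction and seek to a frame index, mirroring Seek() above. */
static gboolean seek_to_frame (GstElement *pipeline, guint64 frame_index)
{
  gint fps_n = 25, fps_d = 1;                                                  /* placeholder framerate */
  guint64 time_per_frame = gst_util_uint64_scale (GST_SECOND, fps_d, fps_n);   /* 40 ms per frame at 25 fps */
  gint64 target_ns = (gint64) (frame_index * time_per_frame);
  return gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
      (GstSeekFlags) (GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT), target_ns);
}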
4012 
4013 MOulong
4014 moGsGraph::GetFramesLength() {
4015 
4016  GstFormat fmt = GST_FORMAT_TIME;
4017 
4018  gint64 len,lenF;
4019 #ifndef GSTVERSION
4020  if (gst_element_query_duration ((GstElement*)m_pGstPipeline, &fmt, &len)) {
4021 #else
4022  if (gst_element_query_duration ((GstElement*)m_pGstPipeline, fmt, &len)) {
4023 #endif
4024  /*g_print ("Time: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
4025  GST_TIME_ARGS (pos), GST_TIME_ARGS (len));*/
4026  //if (m_VideoFormat.m_TimePerFrame) m_VideoFormat.m_TimePerFrame = 25;
4028  lenF = ( len / ( m_VideoFormat.m_TimePerFrame ) );
4029  //cout << "gsgraph: len: ns: " << len << " frames:" << m_FramesLength << endl;
4030  MODebug2->Message( "Total length (milliseconds):" + IntToStr(len/GST_MSECOND) + " (frames): " + IntToStr(lenF));
4031  m_FramesLength = lenF;
4032  return m_FramesLength;
4033  }
4034 
4035  return 0;
4036 }
4037 
4038 MOulong
4039 moGsGraph::GetSamplesLength() {
4040  GstFormat fmt = GST_FORMAT_TIME;
4041 
4042  gint64 len,lenF;
4043 #ifndef GSTVERSION
4044  if (gst_element_query_duration ((GstElement*)m_pGstPipeline, &fmt, &len)) {
4045 #else
4046  if (gst_element_query_duration ((GstElement*)m_pGstPipeline, fmt, &len)) {
4047 #endif
4048  /*g_print ("Time: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
4049  GST_TIME_ARGS (pos), GST_TIME_ARGS (len));*/
4051  lenF = ( len / ( m_AudioFormat.m_TimePerSample ) );
4052  //cout << "gsgraph: len: ns: " << len << " frames:" << m_FramesLength << endl;
4053  MODebug2->Message( "Total length (milliseconds):" + IntToStr(len/GST_MSECOND) + " (samples): " + IntToStr(lenF));
4054  m_SamplesLength = lenF;
4055  return m_SamplesLength;
4056  }
4057 
4058  return 0;
4059 }
4060 
4061 MOulong
4062 moGsGraph::GetDuration() {
4063 
4064  GstFormat fmt = GST_FORMAT_TIME;
4065 
4066  gint64 dur;
4067 
4068 #ifndef GSTVERSION
4069  if (gst_element_query_duration ((GstElement*)m_pGstPipeline, &fmt, &dur)) {
4070 #else
4071  if (gst_element_query_duration ((GstElement*)m_pGstPipeline, fmt, &dur)) {
4072 #endif
4073  m_Duration = GST_TIME_AS_MSECONDS(dur); //in milliseconds 1ms = 1 million ns
4074  //cout << "gsgraph: dur: ns: " << dur << endl;
4075  return m_Duration;
4076  }
4077 
4078  return 0;
4079 }
4080 
4083 
4084 MOulong
4085 moGsGraph::GetPosition() {
4086 
4087  GstFormat fmt = GST_FORMAT_TIME;
4088  gint64 pos,frame;
4089 
4090 #ifndef GSTVERSION
4091  if (gst_element_query_position ((GstElement*)m_pGstPipeline, &fmt, &pos)) {
4092 #else
4093  if (gst_element_query_position ((GstElement*)m_pGstPipeline, fmt, &pos)) {
4094 #endif
4095  if (m_VideoFormat.m_TimePerFrame==0) {
4096  return (pos / 1000000);
4097  }
4098  frame = pos / (gint64) m_VideoFormat.m_TimePerFrame;
4099  return (MOulong)frame;
4100  }
4101  return 0;
4102 }
4103 
4104 MOulong
4105 moGsGraph::GetPositionMS() {
4106 
4107  GstFormat fmt = GST_FORMAT_TIME;
4108  gint64 pos;
4109 
4110 #ifndef GSTVERSION
4111  if (gst_element_query_position ((GstElement*)m_pGstPipeline, &fmt, &pos)) {
4112 #else
4113  if (gst_element_query_position ((GstElement*)m_pGstPipeline, fmt, &pos)) {
4114 #endif
4115  return (MOulong)GST_TIME_AS_MSECONDS(pos);
4116  }
4117  return 0;
4118 }
4119 
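The two methods above query the pipeline in GST_FORMAT_TIME and convert the result to frames or milliseconds. As a rough, illustrative sketch only (not code from this file), the same queries can be combined to report playback progress, using the GStreamer 1.x signatures:

#include <gst/gst.h>

/* Minimal sketch (not part of moGsGraph): report playback progress from the
   position/duration queries used above. */
static void print_progress (GstElement *pipeline)
{
  gint64 pos_ns = 0, dur_ns = 0;
  if (gst_element_query_position (pipeline, GST_FORMAT_TIME, &pos_ns) &&
      gst_element_query_duration (pipeline, GST_FORMAT_TIME, &dur_ns) &&
      dur_ns > 0) {
    g_print ("position %" GST_TIME_FORMAT " of %" GST_TIME_FORMAT " (%.1f%%)\n",
        GST_TIME_ARGS (pos_ns), GST_TIME_ARGS (dur_ns),
        100.0 * (gdouble) pos_ns / (gdouble) dur_ns);
  }
}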
4120 
4121 bool
4122 moGsGraph::IsRunning() {
4123  if (!m_pGstPipeline) return false;
4124  if (gst_element_get_state ((GstElement*)m_pGstPipeline, NULL, NULL, -1) == GST_STATE_CHANGE_FAILURE ) return false;
4125  return true;
4126 }
4127 
4128 void
4129 moGsGraph::SetVolume( float volume ) {
4130  if (m_pAudioVolume && m_bInitialized ) {
4131  g_object_set ( (GstElement*)m_pAudioVolume, "volume", volume, NULL);
4132  }
4133 }
4134 
4135 void
4136 moGsGraph::SetBalance( float balance ) {
4137  if (m_pAudioPanorama && m_bInitialized ) {
4138  g_object_set ( (GstElement*)m_pAudioPanorama, "panorama", balance, NULL);
4139  }
4140 }
4141 
4142 void
4143 moGsGraph::SetPitch( float pitch ) {
4144  if (m_pAudioSpeed && m_bInitialized ) {
4145  //Pause();
4146  g_object_set ( (GstElement*)m_pAudioSpeed, "speed", pitch, NULL);
4147  //Play();
4148  }
4149 }
4150 
4151 void
4152 moGsGraph::SetEchoDelay( float delay ) {
4153  unsigned long long delayl = delay;
4154  if (m_pAudioEcho && m_bInitialized ) {
4155  g_object_set ( (GstElement*)m_pAudioEcho, "delay", delayl, NULL);
4156  }
4157 }
4158 
4159 void
4160 moGsGraph::SetEchoIntensity( float intensity ) {
4161  if (m_pAudioEcho && m_bInitialized ) {
4162  g_object_set ( (GstElement*)m_pAudioEcho, "intensity", intensity, NULL);
4163  }
4164 }
4165 
4166 void
4167 moGsGraph::SetEchoFeedback( float feedback ) {
4168  if (m_pAudioEcho && m_bInitialized ) {
4169  g_object_set ( (GstElement*)m_pAudioEcho, "feedback", feedback, NULL);
4170  }
4171 }
4172 
4173 
4174 void moGsGraph::SetBrightness( float brightness ) {
4175  if (m_pVideoBalance && m_bInitialized ) {
4176  g_object_set ( (GstElement*)m_pVideoBalance, "brightness", brightness, NULL);
4177  }
4178 }
4179 
4180 
4181 
4182 void moGsGraph::SetContrast( float contrast ) {
4183  if (m_pVideoBalance && m_bInitialized ) {
4184  g_object_set ( (GstElement*)m_pVideoBalance, "contrast", contrast, NULL);
4185  }
4186 }
4187 
4188 
4189 
4190 void moGsGraph::SetHue( float hue ) {
4191  if (m_pVideoBalance && m_bInitialized ) {
4192  g_object_set ( (GstElement*)m_pVideoBalance, "hue", hue, NULL);
4193  }
4194 }
4195 
4196 
4197 
4198 void moGsGraph::SetSaturation( float saturation ) {
4199  if (m_pVideoBalance && m_bInitialized ) {
4200  g_object_set ( (GstElement*)m_pVideoBalance, "saturation", saturation, NULL);
4201  }
4202 }
4203 
4204 
4205 
4206 MObyte *
4207 moGsGraph::GetFrameBuffer( MOlong* size ) {
4208 
4209  #ifdef USING_SYNC_FRAMEBUFFER
4210  size = NULL;
4211  GstAppSink* psink;
4212  GstSample* sample;
4213  psink = (GstAppSink*) m_pFakeSink;
4214  if (psink && !this->m_VideoFormat.m_WaitForFormat) {
4215  //GstSample* sample = gst_app_sink_try_pull_sample ( psink, 1000000000 );
4216  //GstSample* sample = gst_app_sink_pull_sample ( psink);
4217  sample = gst_app_sink_pull_sample ( psink);
4218  GstMapInfo mapinfo;
4219  int w = m_VideoFormat.m_Width;
4220  int h = m_VideoFormat.m_Height;
4221  moBucket *pbucket=NULL;
4222  GstCaps* bcaps = gst_sample_get_caps( sample );
4223  if (!bcaps) return NULL;
4224 
4225  GstBuffer* Gbuffer = gst_sample_get_buffer (sample);
4226  int bsize = gst_buffer_get_size( Gbuffer );
4227  if (!( bsize>0 && (int)bsize<=(h*w*4) )) return NULL;
4228 
4229  if (!m_pBucketsPool) return NULL;
4230  if(m_pBucketsPool->IsFull()) {
4231  gst_sample_unref(sample);
4232  return NULL;
4233  }
4234 
4235  pbucket = new moBucket();
4236  if (pbucket==NULL) return NULL;
4237 
4238  if (Gbuffer) {
4239  gst_buffer_map ( Gbuffer, &mapinfo, GST_MAP_READ);
4240  if (bsize) {
4241  //pGsGraph->MODebug2->Message(moText("copying: ") + IntToStr(bsize) );
4242  //pGsGraph->m_Buckets[0].Copy( bsize, (MOubyte*)mapinfo.data );
4243  pbucket->SetBuffer( bsize,(MOubyte*)mapinfo.data );
4244  //pbucket->BuildBucket(bsize,128);
4245  } else {
4246  //MODebug2->Error(moText("m_Buckets size: ") + IntToStr(pGsGraph->m_Buckets[0].GetSize()) + moText(" do not match with buffer size: ") + IntToStr(bsize) );
4247  }
4248  gst_buffer_unmap ( Gbuffer, &mapinfo );
4249 
4250  bool added_bucket = m_pBucketsPool->AddBucket( pbucket );
4251  if(!added_bucket)
4252  MODebug2->Error(moText("appsink_new_sample > Bucket error"));
4253  gst_sample_unref(sample);
4254 
4255  }
4256  }
4257 #endif
4258  return NULL;
4259 }
4260 
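GetFrameBuffer() above pulls prerolled samples synchronously from an appsink, maps each buffer and copies it into a bucket. As an illustrative sketch only (not the graph built elsewhere in this file), an appsink could be configured for that pull model like this; the element name and caps string are placeholders:

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Minimal sketch (not part of moGsGraph): configure an appsink so raw RGBA
   samples can be pulled synchronously, as GetFrameBuffer() does above. */
static GstElement *make_pull_sink (void)
{
  GstElement *appsink = gst_element_factory_make ("appsink", "frame-sink");
  GstCaps *caps = gst_caps_from_string ("video/x-raw,format=RGBA");
  g_object_set (appsink,
      "caps", caps,         /* only accept raw RGBA video */
      "max-buffers", 2,     /* keep the internal queue short */
      "drop", TRUE,         /* drop stale samples instead of blocking upstream */
      "sync", FALSE,        /* do not wait on the pipeline clock */
      NULL);
  gst_caps_unref (caps);
  /* consumer side: GstSample *s = gst_app_sink_pull_sample (GST_APP_SINK (appsink)); */
  return appsink;
}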
4261 
4262 /*GET SOURCES FROM FILTER*/
4263 /*
4264 GstElement* tee;
4265 GstPad * pad;
4266  gchar *name;
4267 
4268  pad = gst_element_get_request_pad (tee, "src%d");
4269  name = gst_pad_get_name (pad);
4270  g_print ("A new pad %s was created\n", name);
4271  g_free (name);
4272 */
4273 
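The request-pad template name in the snippet above ("src%d") is the GStreamer 0.10 convention; under GStreamer 1.x the tee element names its request pads "src_%u". A minimal, illustrative sketch (not code from this file) of requesting and later releasing such a pad, assuming an already-created tee element:

#include <gst/gst.h>

/* Minimal sketch (not part of moGsGraph): request a source pad from a tee
   (GStreamer 1.x naming) and release it when the branch is no longer needed. */
static void add_tee_branch (GstElement *tee)
{
  GstPad *branch_pad = gst_element_get_request_pad (tee, "src_%u");
  g_print ("got request pad %s\n", GST_PAD_NAME (branch_pad));
  /* ... link branch_pad to a downstream queue and stream for a while ... */
  gst_element_release_request_pad (tee, branch_pad);
  gst_object_unref (branch_pad);
}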
4274 
4275 /*CONNECTING TWO PINS OR PADS WITH SOME CAPABILITIES SET*/
4276 /*
4277  gboolean link_ok;
4278  GstCaps *caps;
4279 
4280  caps = gst_caps_new_full (
4281  gst_structure_new ("video/x-raw-yuv",
4282  "width", G_TYPE_INT, 384,
4283  "height", G_TYPE_INT, 288,
4284  "framerate", GST_TYPE_FRACTION, 25, 1,
4285  NULL),
4286  gst_structure_new ("video/x-raw-rgb",
4287  "width", G_TYPE_INT, 384,
4288  "height", G_TYPE_INT, 288,
4289  "framerate", GST_TYPE_FRACTION, 25, 1,
4290  NULL),
4291  NULL);
4292 
4293  link_ok = gst_element_link_filtered (element1, element2, caps);
4294  gst_caps_unref (caps);
4295 
4296  if (!link_ok) {
4297  g_warning ("Failed to link element1 and element2!");
4298  }
4299 */
4300 
4301 
4302 
4303 /*
4304 #include <gst/gst.h>
4305 
4306 [.. my_bus_callback goes here ..]
4307 
4308 static gboolean
4309 idle_exit_loop (gpointer data)
4310 {
4311  g_main_loop_quit ((GMainLoop *) data);
4312 
4313 
4314  return FALSE;
4315 }
4316 
4317 static void
4318 cb_typefound (GstElement *typefind,
4319  guint probability,
4320  GstCaps *caps,
4321  gpointer data)
4322 {
4323  GMainLoop *loop = data;
4324  gchar *type;
4325 
4326  type = gst_caps_to_string (caps);
4327  g_print ("Media type %s found, probability %d%%\n", type, probability);
4328  g_free (type);
4329 
4330  g_idle_add (idle_exit_loop, loop);
4331 }
4332 
4333 gint
4334 main (gint argc,
4335  gchar *argv[])
4336 {
4337  GMainLoop *loop;
4338  GstElement *pipeline, *filesrc, *typefind;
4339  GstBus *bus;
4340 
4341  //
4342  gst_init (&argc, &argv);
4343  loop = g_main_loop_new (NULL, FALSE);
4344 
4345  //
4346  if (argc != 2) {
4347  g_print ("Usage: %s <filename>\n", argv[0]);
4348  return -1;
4349  }
4350 
4351 
4352  pipeline = gst_pipeline_new ("pipe");
4353 
4354  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
4355  gst_bus_add_watch (bus, my_bus_callback, NULL);
4356  gst_object_unref (bus);
4357 
4358 
4359  filesrc = gst_element_factory_make ("filesrc", "source");
4360  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
4361  typefind = gst_element_factory_make ("typefind", "typefinder");
4362  g_signal_connect (typefind, "have-type", G_CALLBACK (cb_typefound), loop);
4363 
4364 
4365  gst_bin_add_many (GST_BIN (pipeline), filesrc, typefind, NULL);
4366  gst_element_link (filesrc, typefind);
4367  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
4368  g_main_loop_run (loop);
4369 
4370 
4371  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
4372  gst_object_unref (GST_OBJECT (pipeline));
4373 
4374  return 0;
4375 }
4376 
4377 */
4378 #endif