JeVoisBase  1.5
JeVois Smart Embedded Machine Vision Toolkit Base Modules
DemoCPUGPU.C
// ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////
//
// JeVois Smart Embedded Machine Vision Toolkit - Copyright (C) 2016 by Laurent Itti, the University of Southern
// California (USC), and iLab at USC. See http://iLab.usc.edu and http://jevois.org for information about this project.
//
// This file is part of the JeVois Smart Embedded Machine Vision Toolkit. This program is free software; you can
// redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software
// Foundation, version 2. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
// without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public
// License for more details. You should have received a copy of the GNU General Public License along with this program;
// if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
//
// Contact information: Laurent Itti - 3641 Watt Way, HNB-07A - Los Angeles, CA 90089-2520 - USA.
// Tel: +1 213 740 3527 - itti@pollux.usc.edu - http://iLab.usc.edu - http://jevois.org
// ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////
/*! \file */

#include <jevois/Core/Module.H>
#include <jevois/Types/Enum.H>

// Component and image-helper headers used by this module:
#include <jevoisbase/Components/Saliency/Saliency.H>
#include <jevoisbase/Components/FilterGPU/FilterGPU.H>
#include <jevoisbase/Components/Tracking/Kalman2D.H>
#include <jevois/Image/RawImageOps.H>
#include <jevois/Debug/Log.H>

#include <linux/videodev2.h>

#include <opencv2/imgproc/imgproc.hpp>
#include <future>
#include <cstring> // for memcpy

// icon by Freepik in computer at flaticon

//! Simple image filtering demo using 4-core CPU processing and OpenGL-ES 2.0 shaders on the Mali-400MP2 GPU
/*! In this demo, we compute saliency and gist over our 4 CPU cores while we also compute 4 different image filters over
    the GPU, finally combining all results into a single grayscale image:

    - saliency: multicore CPU-based detection of the most conspicuous (most attention-grabbing) object in the field of
      view.
    - GPU filter 1: Sobel edge detector
    - GPU filter 2: Median filter
    - GPU filter 3: Morphological erosion filter
    - GPU filter 4: Morphological dilation filter

    For an introduction to visual saliency, see http://ilab.usc.edu/bu/

    Video output
    ------------

    The video output is arranged vertically, from top to bottom:
    - Sobel results (same size as input image)
    - Median filter results (same size as input image)
    - Morphological erosion filter results (same size as input image)
    - Morphological dilation filter results (same size as input image)
    - Saliency results, from left to right: saliency map, color map, intensity map, orientation map, flicker map, and
      motion map. Each map is the input size divided by 8 horizontally and vertically. The gist vector is appended to
      the right, but note that at 160x120 it is truncated (see source code for details).
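
    Concretely, with this module's 160x120 YUYV input and the saliency map scale of 3 set in the constructor, each map
    is 20x15 pixels, the six maps occupy 120 of the 160 output columns, and the gist block fills the remaining 40
    columns over 15 rows (i.e., only the first 600 gist values are shown). The output frame is thus 160x495 GREY
    (4 x 120 rows of GPU filter results plus 15 rows of maps and gist), matching the video mapping listed below.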

    Serial Messages
    ---------------

    This module can send standardized serial messages as described in \ref UserSerialStyle, where all coordinates and
    sizes are standardized using \ref coordhelpers. One message is issued on every video frame at the temporally
    filtered attended (most salient) location:

    - Serial message type: \b 2D
    - `id`: always \b sm (shorthand for saliency map)
    - `x`, `y`: standardized 2D coordinates of temporally-filtered most salient point
    - `w`, `h`: always 0, 0
    - `extra`: none (empty string)
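
    For example, with the Normal serial style, a frame whose attended point lies a bit left of and above image center
    might produce a line along the lines of `N2 sm -200 -100 0 0`. This sample line is only an illustration; the exact
    field order and coordinate ranges for each style are defined in \ref UserSerialStyle.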

    @author Laurent Itti

    @displayname Demo CPU GPU
    @videomapping GREY 160 495 60.0 YUYV 160 120 60.0 JeVois DemoCPUGPU
    @email itti\@usc.edu
    @address University of Southern California, HNB-07A, 3641 Watt Way, Los Angeles, CA 90089-2520, USA
    @copyright Copyright (C) 2016 by Laurent Itti, iLab and the University of Southern California
    @mainurl http://jevois.org
    @supporturl http://jevois.org/doc
    @otherurl http://iLab.usc.edu
    @license GPL v3
    @distribution Unrestricted
    @restrictions None
    \ingroup modules */
class DemoCPUGPU : public jevois::StdModule
{
  public:
    //! Constructor
    DemoCPUGPU(std::string const & instance) : jevois::StdModule(instance)
    {
      itsFilter = addSubComponent<FilterGPU>("gpu");
      itsSaliency = addSubComponent<Saliency>("saliency");
      itsKF = addSubComponent<Kalman2D>("kalman");

      // Use some fairly large saliency and feature maps so we can see them:
      itsSaliency->centermin::set(1);
      itsSaliency->smscale::set(3);
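      // Note: smscale = 3 makes the saliency and feature maps the input size divided by 8 (1 << 3) in each dimension,
      // which is the map size mentioned in the output layout description above.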
    }

    //! Virtual destructor for safe inheritance
    virtual ~DemoCPUGPU() { }

    //! Set our GPU program after we are fully constructed and our Component path has been set
    void postInit() override
    {
      itsFilter->setProgram("shaders/simplevertshader.glsl", "shaders/combofragshader.glsl");
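      // "offset" and "scale" are uniforms consumed by the vertex shader; presumably the values below stretch the unit
      // quad to cover the full [-1,1] clip space so the fragment filters run over the whole image: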
      itsFilter->setProgramParam2f("offset", -1.0F, -1.0F);
      itsFilter->setProgramParam2f("scale", 2.0F, 2.0F);
    }

    //! Processing function
    virtual void process(jevois::InputFrame && inframe, jevois::OutputFrame && outframe) override
    {
      // Wait for next available camera image:
      jevois::RawImage const inimg = inframe.get(); unsigned int const w = inimg.width, h = inimg.height;
      inimg.require("input", w, h, V4L2_PIX_FMT_YUYV); // accept any image size but require YUYV pixels

      // In this demo, the GPU completes earlier than the CPU. Hence, we are going to wait for the output frame in the
      // main thread (which runs the GPU code). We use a mutex to signal to the saliency thread when the output image is
      // available. Using a mutex and unique_lock here ensures that we will be exception-safe (which may be trickier
      // with a condition variable or such):
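      // Overall handshake: the main thread takes the lock, launches the saliency thread, runs the GPU filters, grabs
      // the output buffer from the gadget driver, and only then releases the lock, so that the saliency thread can
      // paste its maps and gist into the bottom of that same output image.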
      jevois::RawImage outimg;
      std::unique_lock<std::mutex> lck(itsOutMtx); // mutex will be released by main thread when outimg is available

      // Launch the saliency computation in a thread:
      auto sal_fut = std::async(std::launch::async, [&]() {
          // Compute saliency and gist:
          itsSaliency->process(inimg, true);

          // Find most salient point:
          int mx, my; intg32 msal; itsSaliency->getSaliencyMax(mx, my, msal);

          // Compute attended location in original frame coordinates:
          int const smlev = itsSaliency->smscale::get();
          int const smadj = smlev > 0 ? (1 << (smlev-1)) : 0; // half a saliency map pixel adjustment
          unsigned int const dmx = (mx << smlev) + smadj;
          unsigned int const dmy = (my << smlev) + smadj;
          unsigned int const mapw = itsSaliency->salmap.dims.w, maph = itsSaliency->salmap.dims.h;
          unsigned int const gistsize = itsSaliency->gist_size;
          unsigned int const gistw = w - 6 * mapw; // width for gist block
          unsigned int const gisth = (gistsize + gistw - 1) / gistw; // divide gist_size by gistw and round up

          // Filter over time the salient location coordinates:
          itsKF->set(dmx, dmy, w, h);
          float kfxraw, kfyraw; itsKF->get(kfxraw, kfyraw, 1.0F); // round to int for serial

          // Send kalman-filtered most-salient-point info to serial port (for arduino, etc):
          sendSerialStd2D(kfxraw, kfyraw, 0.0F, 0.0F, "sm");

          // Wait for output image to be available:
          std::lock_guard<std::mutex> _(itsOutMtx);

          // Paste saliency and feature maps:
          unsigned int offset = 0;
          pasteGrayMap(outimg, itsSaliency->salmap, offset, h*4, 20);
          pasteGrayMap(outimg, itsSaliency->color, offset, h*4, 18);
          pasteGrayMap(outimg, itsSaliency->intens, offset, h*4, 18);
          pasteGrayMap(outimg, itsSaliency->ori, offset, h*4, 18);
          pasteGrayMap(outimg, itsSaliency->flicker, offset, h*4, 18);
          pasteGrayMap(outimg, itsSaliency->motion, offset, h*4, 18);

          // Paste gist. Note: at 160x120 we will only end up pasting the first 600 gist entries. This code may fail at
          // different resolutions, beware:
          unsigned char * d = outimg.pixelsw<unsigned char>() + 4*w*h + 6*mapw;
          for (unsigned int i = 0; i < maph; ++i) memcpy(d + i*w, itsSaliency->gist + i*gistw, gistw);
        });

      // Convert input image to grayscale:
      cv::Mat grayimg = jevois::rawimage::convertToCvGray(inimg);

      // Once saliency is done using the input image, let camera know we are done with it:
      itsSaliency->waitUntilDoneWithInput();
      inframe.done();

      // Upload the greyscale image to GPU and apply 4 grayscale GPU filters, storing the 4 results into an RGBA image:
      cv::Mat gpuout(h, w, CV_8UC4);
      itsFilter->process(grayimg, gpuout);

      // Wait for an image from our gadget driver into which we will put our results:
      outimg = outframe.get();
      lck.unlock(); // let saliency thread proceed with using the output image
      outimg.require("output", w, h * 4 + (h >> itsSaliency->smscale::get()), V4L2_PIX_FMT_GREY);

      // Unpack the GPU output into 4 gray images, starting from top-left of the output image:
      jevois::rawimage::unpackCvRGBAtoGrayRawImage(gpuout, outimg);

      // Wait until saliency computation is complete:
      sal_fut.get();

      // Send the output image with our processing results to the host over USB:
      outframe.send();
    }

    // ####################################################################################################
    //! Paste a map and add its width to the dx offset
    /*! Beware this is for a gray outimg only. bitshift right-shifts each 32-bit map value before clamping it to 255,
        bringing it into 8-bit range for display. */
    void pasteGrayMap(jevois::RawImage & outimg, env_image const & fmap, unsigned int & dx, unsigned int dy,
                      unsigned int bitshift)
    {
      env_size_t const fw = fmap.dims.w, fh = fmap.dims.h;
      unsigned int const ow = outimg.width, oh = outimg.height;

      if (dy + fh > oh) LFATAL("Map would extend past output image bottom");
      if (fw + dx > ow) LFATAL("Map would extend past output image width");

      unsigned int const stride = ow - fw;

      intg32 * s = fmap.pixels; unsigned char * d = outimg.pixelsw<unsigned char>() + dx + dy * ow;

      for (unsigned int j = 0; j < fh; ++j)
      {
        for (unsigned int i = 0; i < fw; ++i)
        {
          intg32 v = (*s++) >> bitshift; if (v > 255) v = 255;
          *d++ = (unsigned char)(v);
        }
        d += stride;
      }

      // Add the map width to the dx offset:
      dx += fw;
    }

  private:
    std::shared_ptr<FilterGPU> itsFilter;
    std::shared_ptr<Saliency> itsSaliency;
    std::shared_ptr<Kalman2D> itsKF;
    std::mutex itsOutMtx;
};

// Allow the module to be loaded as a shared object (.so) file:
JEVOIS_REGISTER_MODULE(DemoCPUGPU);