1
4. Combining Video And Graphics
  • To obtain a first-person experience of real car
    racing, a video camera has been installed on the
    Mindstorms robot car
  • Real-time video is sent through a wireless
    channel to the video server
  • A software program must be developed to retrieve
    the video from the server and display it in Ogre
  • Hence video and graphics are combined to achieve
    augmented reality

References
1. krssagar, Simultaneous Previewing & Video Capture
   using DirectShow,
   http://www.codeproject.com/KB/audio-video/DXCapture.aspx
2. Ogre Tutorials,
   http://www.ogre3d.org/wiki/index.php/Ogre_Tutorials
2
The Video Server
  • To minimize the delay in controlling the car,
    video information is sent through a 2.4GHz analogue
    channel from the car to the server
  • Properties of analogue video:
  • Advantage: minimal delay
  • Disadvantage: small capacity, i.e. there cannot
    be too many video channels
  • Each car uses a different frequency (in the
    2.4GHz band) to send the video information to the
    server
  • Due to the limitation of the available
    frequencies, at most 4 cars can send video
    information at the same time
  • Video information is received by a receiver which
    is connected to the server via a USB port
  • Users need to install their program on the server
    to read the video information through the USB port
    and display it on the screen

3
The Video Server (cont)
  • Precaution: since analogue video is used, the
    received video quality can be affected by many
    environmental factors, such as:
  • Distance between the car and the receiver
  • Electrical appliances that generate a similar
    frequency

[Diagram: cars send video over a 2.4GHz analogue channel
to receivers, which connect to the video server via USB]
4
The Webcam Class
  • A class called Webcam has been pre-built to allow
    user programs to retrieve video data
  • A user program only needs to add the library
    Webcam.lib to the project and include the header
    file Webcam.h

[Diagram: the Webcam class in Webcam.lib runs on the
video server, reading the data that arrives from the
receivers (2.4GHz analogue channel) through the USB port]
5
The Webcam Class (cont)
  • The Webcam class provides functions for
    previewing and grabbing simultaneously from a
    webcam using DirectShow technology
  • DirectShow is a middleware architecture that
    provides a pipeline for media playback and
    capture
  • The DirectShow API enables playback of multimedia
    content from local storage or from sources
    streamed over a network or the Internet
  • To use DirectShow, the Microsoft Windows SDK has
    to be installed

6
The Webcam Class (cont)
class Webcam
{
public:
    Webcam(void);
    ~Webcam(void);
    HRESULT Init(int iDeviceID, HWND hWnd,
                 int iWidth, int iHeight);
    DWORD GetFrame(BYTE **pFrame);
    DWORD GrabFrame();
};
7
Webcam::Init()
  • Hooks the user's program up with, in this
    project, the USB port for receiving video data
  • Performs the initialization of the video capture
    function
  • Input parameters:
  • iDeviceID is the selected device index. In this
    project, the number should range from 0 to 3
  • hWnd is the handle of the display window. In this
    project, since the video will not be displayed in
    a Microsoft Windows window, we can fill in NULL
    here
  • iWidth and iHeight are the size of the video
    frame. In this project, we set them to 320 and
    240, respectively
  • Returns S_FALSE if everything is fine (a call
    sketch follows this list)
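
A minimal call sketch, assuming the Webcam class from
Webcam.h; the variable name mWebcam and the failure
handling are illustrative only (note that, per the
slide, S_FALSE signals success):

#include "Webcam.h"

Webcam mWebcam;   // pre-built capture class
// Device index 0, no display window (NULL), 320x240 frames
HRESULT hr = mWebcam.Init(0, NULL, 320, 240);
if (hr != S_FALSE)
{
    // Initialization failed, e.g. no receiver is
    // available at that device index
}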

8
Webcam::GrabFrame()
  • GrabFrame() grabs a frame of image from the
    camera and stores it in an internal buffer
  • Input parameter: nil
  • Return value: the size of the buffer

9
Webcam::GetFrame()
  • Obtains the address of the image data buffer
  • Input parameter:
  • pFrame is the address of a memory location for
    storing the address of the image data buffer
  • Returns the size of the buffer, which should be
    the same as that returned by GrabFrame()

BYTE *pImage;   // Define pImage as a BYTE pointer,
                // but do not initialize its value
DWORD bufferSize = GetFrame(&pImage);
// After calling the function, pImage will be
// initialized with the address of the image
// data buffer
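
Putting GrabFrame() and GetFrame() together, a hedged
sketch of one capture iteration (the object name
mWebcam is an assumption):

BYTE *pImage = NULL;
DWORD grabbed = mWebcam.GrabFrame();    // copy one frame into the internal buffer
DWORD fetched = mWebcam.GetFrame(&pImage); // pImage now points at that buffer
// grabbed and fetched should report the same buffer size;
// the bytes behind pImage can then be copied onto the
// Ogre texture (see UpdateTexture() later)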
10
Software Architecture of the User's Program

[Flow: each rendered frame runs from "Update screen"
through processCalculation() to "Finish update screen"]

Ogre main program:
1. Create Webcam object
2. Create a plane to cover up the whole screen
3. Define the material of that plane
4. Create the texture object to be used on that plane

processCalculation(), called every frame:
1. Grab 1 video frame to buffer
2. Get the pointer of the buffer
3. Put the data in the buffer on the texture object
   of the plane
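
A hedged C++ skeleton of this architecture; apart from
the Webcam and TextureSystem classes, the member names
are assumptions:

// Called once per rendered frame, between "update
// screen" and "finish update screen"
void processCalculation()
{
    mWebcam.GrabFrame();                   // 1. grab one video frame to the buffer
    BYTE *pImage = NULL;
    mWebcam.GetFrame(&pImage);             // 2. get the pointer of the buffer
    mTextureSystem->UpdateTexture(pImage); // 3. put the buffer data on the
                                           //    texture object of the plane
}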
11
Create a Plane
  • Plane is a primitive in Ogre
  • We can change the size, position, orientation and
    the texture of a plane
  • In our previous labs, the ferris wheel was built
    on a plane laid horizontally with a grass texture
  • In this lab, we are going to create a plane that
    covers the whole screen
  • We shall set a texture on the plane that allows
    us to put the data in the image buffer onto it

12
Ogre::Plane()
  • The Plane class of Ogre defines a plane in 3D
    space
  • There are a few constructors for the Plane class.
    The most common one is as follows:

Plane plane(Vector3::UNIT_Z, 0);
// Define the normal of the plane as the z-axis
// and the distance from the origin as 0

[Diagram: the plane passes through the origin (0,0,0)
with its normal along the z-axis]
13
createPlane()
  • We need to register the plane so that we can use
    it in our project
  • The createPlane() member function of MeshManager
    takes in a Plane definition and makes a mesh from
    the parameters
  • This registers our plane for use, e.g.

MeshManager::getSingleton().createPlane(
    "webcamPlane",
    ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    plane, 320, 240, 20, 20, true, 1, 1, 1,
    Vector3::UNIT_Y);
// A plane called webcamPlane is registered.
// The size is 320 x 240. The plane stands
// parallel to the y-axis
14
Show the Plane on the Screen
  • Similar to other mesh objects, we show the plane
    on the screen by first creating an entity of the
    plane and calling attachObject() and
    setPosition(), e.g.

Entity *ent = mSceneMgr->createEntity("PlaneEntity",
                                      "webcamPlane");
SceneNode *node = mSceneMgr->getRootSceneNode()
                      ->createChildSceneNode();
node->attachObject(ent);
// node->setPosition(x??, y??, z??);
// Choose an appropriate value for x, y, z such that
// the plane will cover the whole screen
15
Set the Material
  • When we manually created the plane, we did not
    specify what texture to use on it
  • In fact, we want to put the image data received
    from the camera onto the plane
  • We first define the name of the material used for
    that plane as follows
  • We shall define the texture of this material
    later

Entity *ent = mSceneMgr->createEntity("PlaneEntity",
                                      "webcamPlane");
ent->setMaterialName("Webcam/MyMaterial");
16
Set the Texture of the Material
  • We create the required texture by instantiating
    an object of the TextureSystem class

mTextureSystem = new TextureSystem(320, 240);
// Create the required texture with size 320x240

Ogre::MaterialPtr mat =
    Ogre::MaterialManager::getSingleton()
        .getByName("Webcam/MyMaterial");
// Get the pointer of the material with name
// Webcam/MyMaterial, the same as used by the plane

Ogre::TextureUnitState *tex =
    mat->getTechnique(0)->getPass(0)
       ->getTextureUnitState(0);
// Get the pointer of the texture unit of that material

tex->setTextureName(mTextureSystem->GetTexture()->getName());
// Set the name of that texture unit the same as that
// created by TextureSystem, i.e. the plane that uses
// the material Webcam/MyMaterial will have the
// texture the same as that created by TextureSystem
17
TextureSystem Class
  • A class called TextureSystem is introduced in
    this lab to handle the creation and update of the
    texture of the material Webcam/MyMaterial
  • Four major public member functions:
  • TextureSystem(): the constructor; creates the
    texture
  • GetTexture(): gets the pointer of the created
    texture
  • CleanTextureContents(): resets all pixels of the
    texture to 0 (paints it all black)
  • UpdateTexture(): copies a frame of video data to
    the texture

18
TextureSystem Class (cont)
class TextureSystem
{
public:
    TextureSystem(int width, int height);
    ~TextureSystem(void);
    Ogre::TexturePtr GetTexture();
        /// Obtain the pointer of the Ogre texture
    void UpdateTexture(BYTE *pBmpTmp);
        /// Copy a frame of video data to the texture
    void CleanTextureContents();
        /// Clean the full texture (paint it all black)
protected:
    Ogre::TexturePtr mTexture;
        /// Texture for rendering the video data
    Ogre::Real mTexWidth;    /// Real texture width
    Ogre::Real mTexHeight;   /// Real texture height
};
19
TextureSystem::TextureSystem()
TextureSystem::TextureSystem(int width, int height)
{
    mTexWidth = width;
    mTexHeight = height;
    // Create the texture we are going to use
    mTexture = Ogre::TextureManager::getSingleton().createManual(
        "WebcamManualTexture",   // name
        Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
        Ogre::TEX_TYPE_2D,       // texture type
        mTexWidth, mTexHeight,
        0,                       // number of mipmaps
        Ogre::PF_BYTE_BGRA,      // pixel format
        Ogre::TU_DYNAMIC_WRITE_ONLY_DISCARDABLE);
}
20
TextureSystem::GetTexture()
  • Just returns the pointer of the created texture,
    i.e. the mTexture member variable

Ogre::TexturePtr TextureSystem::GetTexture()
{
    return mTexture;
}
21
TextureSystem::CleanTextureContents()
void TextureSystem::CleanTextureContents()
{
    unsigned int idx;
    int x, y;
    // lock the pixel buffer and get a pixel box
    // (pDest = locked buffer; texw/texh = texture size)
    for (x = 0, y = 0; y < texh; ) {
        idx = (x * 4) + y * texw * 4;
        pDest[idx]     = 0;   // blue
        pDest[idx + 1] = 0;   // green
        pDest[idx + 2] = 0;   // red
        pDest[idx + 3] = 255; // alpha (255 -> opaque)
        x++;
        if (x >= texw) { x = 0; y++; }
    }
    // Unlock the pixel buffer
}
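
The lock and unlock steps are only hinted at in the
comments above; a sketch of what they typically look
like with Ogre's hardware pixel buffer API, where
pDest, texw and texh are the names used in the loop:

// Lock the pixel buffer and get a pixel box
Ogre::HardwarePixelBufferSharedPtr pixelBuffer = mTexture->getBuffer();
pixelBuffer->lock(Ogre::HardwareBuffer::HBL_DISCARD);
const Ogre::PixelBox &pixelBox = pixelBuffer->getCurrentLock();
Ogre::uint8 *pDest = static_cast<Ogre::uint8 *>(pixelBox.data);
int texw = (int)mTexWidth;   // texture width in pixels
int texh = (int)mTexHeight;  // texture height in pixels

// ... per-pixel loop from the slide runs here ...

// Unlock the pixel buffer
pixelBuffer->unlock();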
22
TextureSystem::CleanTextureContents() (cont)

[Diagram: layout of the locked pixel buffer. pDest
points at Pixel 0; each pixel occupies four consecutive
bytes in the order b, g, r, a. Pixels 0 to 319 fill the
first row, Pixel 320 starts the second row, and the
rows continue down to Pixel 239*320 at the start of the
last row.]
23
TextureSystem::UpdateTexture()
void TextureSystem::UpdateTexture(BYTE *pBmpTmp)
{
    unsigned int idx;
    int x, y;
    // lock the pixel buffer and get a pixel box

    // Input parameter pBmpTmp is a BYTE pointer which
    // gives the address of the buffer where the frame
    // of video data is stored
    // Get the data BYTE by BYTE from pBmpTmp and put
    // them into the pixel buffer of the texture

    // Unlock the pixel buffer
}
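
Completing this body is part of the lab task; a
possible sketch, assuming the captured frame already
matches the texture's 320x240, 4-bytes-per-pixel BGRA
layout (if the camera instead delivers, say, 24-bit RGB
or bottom-up rows, the indexing must be adapted). The
lock/unlock pattern is the same as in
CleanTextureContents():

// Assumption: frame and texture share the same layout,
// so each colour byte copies straight across
for (x = 0, y = 0; y < texh; ) {
    idx = (x * 4) + y * texw * 4;
    pDest[idx]     = pBmpTmp[idx];      // blue
    pDest[idx + 1] = pBmpTmp[idx + 1];  // green
    pDest[idx + 2] = pBmpTmp[idx + 2];  // red
    pDest[idx + 3] = 255;               // alpha (fully opaque)
    x++;
    if (x >= texw) { x = 0; y++; }
}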
24
Tasks to be Achieved
  • In this lab, you are asked to
  • Complete the routine to create the plane, set
    material and texture
  • Complete the implementation of the member
    function UpdateTexture()
  • Show the video in the Ogre environment
  • Drive the car based on the live video received
    from the wireless camera installed on the car