
Use with D8M-GPIP? #22

Open
kutenai opened this issue Mar 19, 2020 · 12 comments

@kutenai

kutenai commented Mar 19, 2020

What would be required to make this work with the D8M-GPIP Terasic camera?

@electro-logic

Hello,

It's not so easy: you need to write a driver so that the camera is recognized as a "webcam".

You can start by studying the D8M here https://github.com/electro-logic/CameraVision to get some good images and learn how to configure it. The next step will be the driver; you can look at an open-source Linux driver for the OV8865 as a starting point, but you need to handle the FPGA/SoC interconnection (e.g. you could implement an AXI-Stream interface to move the pixels to the ARM processor).

Cheers,
Leonardo

@kutenai

kutenai commented Mar 24, 2020

Thanks for the rapid feedback, Leonardo.
I don't think I can tackle this task right now, so I'm looking at other approaches and a different development kit. I have a DE10-Nano, but I also have a CriticalLink Vision Development Kit.

I just need to work out how to get the OpenCL pipeline inserted into the image stream on that kit.

@electro-logic

Hi, you don't need to move the pixels to the ARM processor if you only need OpenCL (and you don't need to write a driver). You can write an "RTL function" exposed to OpenCL that works with your camera. Can you explain in more detail what you want to achieve? I don't know the CriticalLink dev kit, but to use OpenCL you need an OpenCL BSP, which is not so trivial.

@kutenai

kutenai commented Mar 24, 2020

I want to be able to demonstrate an inference engine implemented in the FPGA.
So: take a live camera feed, process the frames through an OpenCL CNN/YOLOv3 inference engine, and output the results in a format that could be used by a flight controller or some other type of vehicle control unit.

I do not expect the Cyclone V to be very fast, so we have no specific "FPS" target. If we can get a working prototype, we can then explore the option of building a system using a more powerful FPGA, like an Arria 10.

So, this task is really to explore and learn about the process, which includes:

  • BSP Development
  • Adding OpenCL Kernel to the BSP
  • Using OpenCL to implement a CNN or equivalent inference engine
  • Formatting the output so that it is usable to a downstream processing unit, or possibly the onboard ARM core (this is a project for later)
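To make the last bullet concrete, here is a hedged sketch of what "formatting the output" could look like: each detection is packed into a fixed-size binary record that a downstream controller could parse. The field layout (class id, confidence, normalized center/size box) is entirely hypothetical, not a real MAVLink or VDK format:

```python
import struct

# Hypothetical fixed-size record: uint8 class id, then 5 little-endian
# float32 fields: confidence and a normalized box (cx, cy, w, h).
DETECTION_FMT = "<Bfffff"
DETECTION_SIZE = struct.calcsize(DETECTION_FMT)  # 21 bytes

def pack_detections(detections):
    """Serialize a list of (class_id, conf, cx, cy, w, h) tuples.

    The message is prefixed with a uint16 record count so the receiver
    knows how many fixed-size records follow.
    """
    payload = b"".join(struct.pack(DETECTION_FMT, *d) for d in detections)
    return struct.pack("<H", len(detections)) + payload

def unpack_detections(buf):
    """Inverse of pack_detections: recover the list of detection tuples."""
    (count,) = struct.unpack_from("<H", buf, 0)
    return [
        struct.unpack_from(DETECTION_FMT, buf, 2 + i * DETECTION_SIZE)
        for i in range(count)
    ]
```

A fixed-size, count-prefixed layout like this keeps the parser on the flight-controller side trivial (no dynamic allocation, no delimiter scanning), which is usually what a small vehicle control unit wants.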

The CriticalLink kit has a working prototype. The camera input is a Basler 5MP camera. The input to the board is a BLVDS link, and there is already VHDL code to convert that data into a vision stream. This vision stream is then put onto an Avalon-ST streaming interface, saved to RAM, copied from RAM, merged with a 'background image', and streamed to an HDMI output.

All of the above is done in the FPGA, without OpenCL. I don't yet have the full BSP for the MitySOM Vision Development Kit [https://support.criticallink.com/redmine/projects/5csx_vdk_basler/wiki].

So I need to build a BSP for that board, and include the OpenCL kernel "Freeze Wrapper".

I am currently learning how to build a custom BSP and trying to learn more about OpenCL.
Your project is a good starting point since it already has the streaming path from USB to OpenCL, then back out.
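One low-risk way to start on the kernel side, before touching any BSP, is to keep a pure-software "golden model" of the per-pixel math and test it on the PC, then port the same logic to OpenCL. A minimal sketch, assuming a fixed-point RGB-to-luma stage as a stand-in for a real pipeline stage (names and the choice of stage are illustrative, not part of the c5soc_opencl project):

```python
# Golden model of a per-pixel kernel stage, kept in plain Python so it can
# be validated on the PC before any long FPGA compile.

def rgb_to_luma(pixel):
    """Integer BT.601-style luma approximation, FPGA-friendly (no floats).

    (77*R + 150*G + 29*B) >> 8 approximates 0.299R + 0.587G + 0.114B.
    """
    r, g, b = pixel
    return (77 * r + 150 * g + 29 * b) >> 8

def process_frame(frame):
    """Apply the stage to a frame given as rows of (R, G, B) tuples."""
    return [[rgb_to_luma(px) for px in row] for row in frame]
```

Because the model uses only integer shifts and multiplies, the OpenCL port can be bit-exact, and the Python version serves as the reference when comparing FPGA output against known frames.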

@electro-logic

Now your goal is much clearer.

  • OpenCL BSP development is not trivial, and (in my opinion) the only feasible option is to start from a reference design for a very similar board and modify it. Starting from scratch is almost impossible because the documentation is not very detailed; even manufacturers like Terasic start from a reference BSP. Both Terasic and CriticalLink provide an OpenCL BSP for your boards, so why do you need to develop a new one?

  • Once you have an OpenCL BSP, the compiler will merge it automatically with your kernel; this part is the standard workflow, no problems here.

  • I recommend developing the kernel on your PC, using only the features supported by the FPGA, and compiling it for the FPGA only when everything is working. Compile times are super long.

  • Remember that with the D8M you have a lot of control, but this means you need to manage the focus motor, gamma, color balance, etc. to get good images. At the same time, it's a good learning experience.

  • In my project I'm not using OpenCL, only a NIOS soft-core to handle the PC-FPGA communication. The project writes a frame from the camera to the SDRAM, and when a command is received the NIOS firmware sends the frame to the PC over JTAG through the USB connector.

  • You can adapt the project so that an OpenCL kernel reads the frame from memory. The MIPI controller writes the memory and OpenCL reads it, so you need some kind of arbitration. The simplest way could be a small RTL component that interfaces to the main OpenCL kernel through a channel, telling it when a frame is ready in memory and when a new frame can be written by the MIPI controller.
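The arbitration in the last bullet can be rehearsed in software before committing to RTL. A rough sketch, assuming a single shared frame buffer with "frame ready"/"buffer free" events standing in for the channel handshake (the real design would use an RTL component and OpenCL channels/pipes; the class and field names here are invented):

```python
import threading

class FrameHandshake:
    """Software stand-in for the MIPI-writer / OpenCL-reader arbitration."""

    def __init__(self):
        self.buffer_free = threading.Event()
        self.frame_ready = threading.Event()
        self.buffer_free.set()      # the writer may fill the buffer first
        self.frame = None

    def write_frame(self, frame):
        """MIPI-controller side: block until the buffer is free, then fill it."""
        self.buffer_free.wait()
        self.buffer_free.clear()
        self.frame = frame
        self.frame_ready.set()      # signal the kernel that a frame is ready

    def read_frame(self):
        """OpenCL-kernel side: block until a frame is ready, then release it."""
        self.frame_ready.wait()
        self.frame_ready.clear()
        frame = self.frame
        self.buffer_free.set()      # allow the next frame to be written
        return frame
```

With writer and reader running in separate threads, frames arrive in order and the writer never overwrites an unread frame, which is exactly the property the RTL handshake has to guarantee.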

@kutenai

kutenai commented Mar 24, 2020

CriticalLink provides a reference BSP for their Cyclone V System on Module (SOM). The card is only one part of the Vision Development Kit, but it does include an example with OpenCL. What is missing is a BSP with support for both the VDK and OpenCL, so that is what I need to build.

I was looking at the DE10-Nano as an option, but the problem there is that there isn't a BSP with support for the D8M camera. I could possibly use the DE10-Nano with this reference BSP and find a suitable USB camera for it. I don't have one at the moment, though.

I'm not sure how large the OpenCL kernel can be in your BSP, or what the streaming options into the kernel are... as I learn more about BSP development, I'll study your version to at least learn more, and I can make a decision then.

@kutenai

kutenai commented Mar 24, 2020

Oh, also: for the VDK, the camera input is not a raw sensor but a sensor board. This board handles the low-level details of the camera sensor, so there's no need for the extensive configuration I'd have to do with the D8M. Also, the VDK already has a streaming interface set up.

The trick is to insert the OpenCL kernel into the VDK's streaming interface. The other issue is that I don't have the actual BSP for the VDK -- I've requested it on the CriticalLink forum, but no response yet; to be fair, I only asked this morning.

So, I'm trying to combine information from the two BSPs:

  • the VDK, with its image stream in the FPGA hardware
  • the OpenCL example BSPs
  • then insert an OpenCL kernel into the VDK stream and allow for some processing.

@electro-logic

If you are not super interested in low-level details, you can buy a DE10-Nano and focus only on the OpenCL kernel, using this c5soc_opencl project and connecting a webcam.
Why do you want to connect a camera sensor? If you want better quality, you can get the best results by connecting a mirrorless/reflex camera with a proper lens through a capture device (e.g. something like an Elgato Capture 4K, but Linux-compatible).

@kutenai

kutenai commented Mar 24, 2020

You make a good point. It won't hurt to spend some time with the DE10-Nano and a webcam, and I might find it's all I need, saving me the time of building the BSP for the VDK.

I have a DE10-Nano, and I have a couple of USB cameras: a C930e and a QuickCam 9000. Both have USB Type-A connectors, so I'll need an adaptor or something to connect to the DE10-Nano -- either USB Type-A to USB Mini-A or Micro-A. I'm also not sure which port on the DE10-Nano to connect the USB camera to.

@electro-logic

You probably need a USB hub to connect a mouse, keyboard, and webcam. I don't have the DE10-Nano, but usually you can use any USB port connected to the ARM.

@kutenai

kutenai commented Mar 25, 2020

That is one reason we want to avoid USB: we need to use the development kit in a portable application and cannot plug all of those bits together. That is why I've been using the VDK.

I received some more information about the VDK, so I'll be spending some time working on a "BSP" for that one, as it is ideally suited to our application -- it just lacks an OpenCL BSP.

I greatly appreciate all of your advice and help.

@electro-logic

The hub, mouse, and keyboard are needed only to ease development; you can connect the webcam directly to the board once everything is finalized. Portable also means you have to think about power, and the DE10-Nano is really "nano": a good power bank with a 3D-printed case for both can do the job without too much hassle (there is a reference 3D model on the Terasic website), and the cable can be hidden inside the case (depending on your scenario, having some cable lets you position the webcam more precisely).

Of course, if this is a real project with some low production volume and you want something super portable and fine-tuned, you can design a custom PCB using, for example, the MitySOM module, but this is not an easy task because camera sensors are high-speed.

Anyway, you know your requirements, and if you already have the VDK... give it a try.

You are welcome,
Leonardo
