Saturday, June 25, 2016
OpenGL and OpenCV with python 2.7 - part 002.
Today I worked with OpenCV and fixed some of my errors.
One of them is an error I got with cv2.VideoCapture: when I tried to load a video and use createBackgroundSubtractorMOG2(), I got this:
cv2.error:
C:\builds\master_PackSlaveAddon-win64-vc12-static\opencv\modules\highgui\src\window.cpp:281:
error: (-215) size.width>0 && size.height>0 in function cv::imshow
You also need to have opencv_ffmpeg310.dll and opencv_ffmpeg310_64.dll in C:\Windows\System32; this is what lets OpenCV play videos.
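This (-215) assertion usually means cv2.imshow() got an empty image, for example when the video cannot be decoded or when cap.read() returns (False, None) at the end of the file. Here is a minimal sketch (my own example, not the exact script that failed) that applies createBackgroundSubtractorMOG2() to a loaded video and checks the ret flag, so imshow never gets an empty frame:
# Sketch: background subtraction on a video file, guarding against empty frames.
# "avi_test_001.avi" is the test file used later in this post.
import cv2

cap = cv2.VideoCapture("avi_test_001.avi")
fgbg = cv2.createBackgroundSubtractorMOG2()

while True:
    ret, frame = cap.read()
    if not ret:
        # end of file or decode failure - stop instead of calling imshow on None
        break
    fgmask = fgbg.apply(frame)           # foreground mask for the current frame
    cv2.imshow('foreground mask', fgmask)
    if cv2.waitKey(30) & 0xFF == 27:     # press ESC to quit
        break

cap.release()
cv2.destroyAllWindows()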
Also make sure you have OpenCV version 3.1.0, because OpenCV comes with some changes on the Python side:
C:\Python27\python
Python 2.7.8 (default, Jun 30 2014, 16:08:48) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>>import cv2
>>>print cv2.__version__
3.1.0
You can get some information about the OpenCV Python module (cv2) with:
>>>cv2.getBuildInformation()
...
>>>cv2.getCPUTickCount()
...
>>>print cv2.getNumberOfCPUs()
...
>>>print cv2.ocl.haveOpenCL()
True
You can also disable OpenCL, which helps with some errors:
>>>cv2.ocl.setUseOpenCL(False)
>>>print cv2.ocl.useOpenCL()
False
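Before moving on, a quick note: besides cv2.getCPUTickCount() shown above, OpenCV also has cv2.getTickCount() and cv2.getTickFrequency(), which can be used to time an operation. A small sketch of my own:
# Sketch: time one OpenCV operation with the tick counter.
import cv2
import numpy as np

img = np.zeros((480, 640, 3), np.uint8)          # a black test image
t1 = cv2.getTickCount()
blur = cv2.GaussianBlur(img, (15, 15), 0)        # the operation to time
t2 = cv2.getTickCount()
print 'GaussianBlur took %f seconds' % ((t2 - t1) / cv2.getTickFrequency())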
Now I will show you how to use the webcam in color and in grayscale, and how to play one video:
webcam color
import numpy as np
import cv2

cap = cv2.VideoCapture(0)            # open the default webcam
while True:
    ret, frame = cap.read()
    cv2.imshow('frame', frame)       # show the color frame
    if cv2.waitKey(5) & 0xFF == 27:  # press ESC to quit
        break
cap.release()
cv2.destroyAllWindows()
webcam gray
import numpy as np
import cv2

cap = cv2.VideoCapture(0)            # open the default webcam
while True:
    ret, frame = cap.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # convert to grayscale
    cv2.imshow('frame', gray)
    if cv2.waitKey(5) & 0xFF == 27:  # press ESC to quit
        break
cap.release()
cv2.destroyAllWindows()
play video
import cv2

capture = cv2.VideoCapture("avi_test_001.avi")
while True:
    ret, img = capture.read()
    if not ret:                      # stop at the end of the video
        break
    cv2.imshow('some', img)
    if cv2.waitKey(5) & 0xFF == 27:  # press ESC to quit
        break
capture.release()
cv2.destroyAllWindows()
Posted by Cătălin George Feștilă.
Labels: 2.7, 2016, cv2, matplotlib, numpy, OpenCV, opengl, pyopengl, python
Wednesday, June 22, 2016
OpenGL and OpenCV with python 2.7 - part 001.
First you need to know which version of Python you use:
C:\Python27>python
Python 2.7.8 (default, Jun 30 2014, 16:08:48) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>>
You also need to download OpenCV version 3.0 from here.
Then run the executable, extract it into a folder, take the cv2.pyd file from \opencv\build\python\2.7\x64 and paste it into \Python27\Lib\site-packages.
If you use the 32-bit Python version, use this path instead: \opencv\build\python\2.7\x86.
Use pip to install the following Python modules:
C:\Python27\Scripts>pip install PyOpenGL
...
C:\Python27\Scripts>pip install numpy
...
C:\Python27\Scripts>pip install matplotlib
...
Let's see how OpenGL works:
C:\Python27>python
Python 2.7.8 (default, Jun 30 2014, 16:08:48) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import OpenGL
>>> import numpy
>>> import matplotlib
>>> import cv2
>>> from OpenGL import *
>>> from numpy import *
>>> from matplotlib import *
>>> from cv2 import *
You can also use dir(module) to see what each module contains. You can import everything from GL, GLU and GLUT.
>>> dir(OpenGL)
['ALLOW_NUMPY_SCALARS', 'ARRAY_SIZE_CHECKING', 'CONTEXT_CHECKING', 'ERROR_CHECKING', 'ERROR_LOGGING', 'ERROR_ON_COPY', 'FORWARD_COMPATIBLE_ONLY', 'FULL_LOGGING', 'FormatHandler', 'MODULE_ANNOTATIONS', 'PlatformPlugin', 'SIZE_1_ARRAY_UNPACK', 'STORE_POINTERS', 'UNSIGNED_BYTE_IMAGES_AS_STRING', 'USE_ACCELERATE', 'WARN_ON_FORMAT_UNAVAILABLE', '__builtins__', '__doc__', '__file__', '__name__', '__package__', '__path__', '__version__', '_bi', 'environ_key', 'os', 'plugins', 'sys', 'version']
>>> from OpenGL.GL import *
>>> from OpenGL.GLU import *
>>> from OpenGL.GLUT import *
>>> from OpenGL.WGL import *
If you know the Python OpenGL module well, you can import only the names you need, like in this example:
>>> from OpenGL.arrays import ArrayDatatype
>>> from OpenGL.GL import (GL_ARRAY_BUFFER, GL_COLOR_BUFFER_BIT,
... GL_COMPILE_STATUS, GL_FALSE, GL_FLOAT, GL_FRAGMENT_SHADER,
... GL_LINK_STATUS, GL_RENDERER, GL_SHADING_LANGUAGE_VERSION,
... GL_STATIC_DRAW, GL_TRIANGLES, GL_TRUE, GL_VENDOR, GL_VERSION,
... GL_VERTEX_SHADER, glAttachShader, glBindBuffer, glBindVertexArray,
... glBufferData, glClear, glClearColor, glCompileShader,
... glCreateProgram, glCreateShader, glDeleteProgram,
... glDeleteShader, glDrawArrays, glEnableVertexAttribArray,
... glGenBuffers, glGenVertexArrays, glGetAttribLocation,
... glGetProgramInfoLog, glGetProgramiv, glGetShaderInfoLog,
... glGetShaderiv, glGetString, glGetUniformLocation, glLinkProgram,
... glShaderSource, glUseProgram, glVertexAttribPointer)
Most of these OpenGL calls need a valid OpenGL rendering context.
For example, you can test this with WGL (WGL, or "Wiggle", is the API between OpenGL and the windowing system interface of Microsoft Windows):
>>> import OpenGL
>>> from OpenGL import *
>>> from OpenGL import WGL
>>> print WGL.wglGetCurrentDC()
None
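wglGetCurrentDC() returns None here because no rendering context is current in the interpreter. As a sketch of how to get one (my own example, assuming the freeglut used by PyOpenGL is installed on your system), you can create a GLUT window and then query OpenGL:
# Sketch: create a GLUT window so that OpenGL calls have a valid rendering context.
import sys
from OpenGL.GL import glClearColor, glClear, glFlush, glGetString, GL_COLOR_BUFFER_BIT, GL_VERSION, GL_RENDERER
from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutInitWindowSize, glutCreateWindow, glutDisplayFunc, glutMainLoop, GLUT_SINGLE, GLUT_RGB

def display():
    # paint the window with a dark blue color
    glClearColor(0.0, 0.0, 0.3, 1.0)
    glClear(GL_COLOR_BUFFER_BIT)
    glFlush()

glutInit(sys.argv)
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB)
glutInitWindowSize(320, 240)
glutCreateWindow('context test')
# now a rendering context exists, so queries like glGetString work
print 'GL_VERSION :', glGetString(GL_VERSION)
print 'GL_RENDERER:', glGetString(GL_RENDERER)
glutDisplayFunc(display)
glutMainLoop()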
Now, let's see the OpenCV Python module with one simple webcam Python script:
import numpy as np
import cv2

cap = cv2.VideoCapture(0)                 # open the default webcam
while True:
    ret, frame = cap.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # convert to grayscale
    cv2.imshow('frame', gray)
    if cv2.waitKey(1) & 0xFF == ord('q'): # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
This is the result from my webcam:
Posted by Cătălin George Feștilă.
Labels: 2.7, 2016, cv2, matplotlib, numpy, OpenCV, opengl, pyopengl, python
Tuesday, June 21, 2016
Scrapy python module - part 001.
To install pip under Python 2.7.8, securely download get-pip.py into your Python27 folder.
Use this command:
C:\Python27\python.exe get-pip.py
...
C:\Python27\Scripts>pip2.7.exe install urllib3
C:\Python27\Scripts>pip2.7 install requests
C:\Python27\Scripts>pip install Scrapy
Some of the Python modules are now installed:
Successfully built PyDispatcher pycparser
Installing collected packages: cssselect, queuelib, six, enum34, ipaddress, idna, pycparser, cffi, pyasn1, cryptography, pyOpenSSL, w3lib, lxml, parsel, PyDispatcher, zope.interface, Twisted, attrs, pyasn1-modules, service-identity, Scrapy
Successfully installed PyDispatcher-2.0.5 Scrapy-1.1.0 Twisted-16.2.0 attrs-16.0.0 cffi-1.7.0 cryptography-1.4 cssselect-0.9.2 enum34-1.1.6 idna-2.1 ipaddress-1.0.16 lxml-3.6.0 parsel-1.0.2 pyOpenSSL-16.0.0 pyasn1-0.1.9 pyasn1-modules-0.0.8 pycparser-2.14 queuelib-1.4.2 service-identity-16.0.0 six-1.10.0 w3lib-1.14.2 zope.interface-4.2.0
>>> import scrapy
>>> print scrapy.version_info
(1, 1, 0)
>>> help(scrapy)
PACKAGE CONTENTS
_monkeypatches
cmdline
command
commands (package)
conf
contracts (package)
contrib (package)
contrib_exp (package)
core (package)
crawler
downloadermiddlewares (package)
dupefilter
dupefilters
exceptions
exporters
extension
extensions (package)
http (package)
interfaces
item
link
linkextractor
linkextractors (package)
loader (package)
log
logformatter
mail
middleware
pipelines (package)
project
resolver
responsetypes
selector (package)
settings (package)
shell
signalmanager
signals
spider
spiderloader
spidermanager
spidermiddlewares (package)
spiders (package)
squeue
squeues
stats
statscol
statscollectors
telnet
utils (package)
xlib (package)
...
C:\Python27>c:\Python27\Scripts\scrapy.exe startproject test_scrapy
New Scrapy project 'test_scrapy', using template directory 'c:\\python27\\lib\\site-packages\\scrapy\\templates\\project', created in:
C:\Python27\test_scrapy
You can start your first spider with:
cd test_scrapy
scrapy genspider example example.com
C:\Python27>cd test_scrapy
C:\Python27\test_scrapy>tree
Folder PATH listing
Volume serial number is 9A67-3A80
C:.
└───test_scrapy
└───spiders
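The spiders folder is empty for now. Just for reference, a spider made with the scrapy genspider example example.com command shown above looks roughly like this (a sketch from memory of the default template, so the generated file may differ a little):
# test_scrapy\spiders\example.py - rough sketch of a generated spider
import scrapy

class ExampleSpider(scrapy.Spider):
    name = "example"
    allowed_domains = ["example.com"]
    start_urls = ['http://www.example.com/']

    def parse(self, response):
        # the generated template leaves parse() empty; here it just prints the page title
        print response.xpath('//title/text()').extract_first()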
Now you need win32api, which you get by installing this Python module:
pip install pypiwin32
...
Downloading pypiwin32-219-cp27-none-win_amd64.whl (7.3MB)
100% |################################| 7.3MB 61kB/s
Installing collected packages: pypiwin32
Successfully installed pypiwin32-219
... and test scrapy bench:
C:\Python27\Scripts\scrapy.exe bench
2016-06-21 22:45:20 [scrapy] INFO: Scrapy 1.1.0 started (bot: scrapybot)
2016-06-21 22:45:20 [scrapy] INFO: Overridden settings: {'CLOSESPIDER_TIMEOUT': 10, 'LOG_LEVEL': 'INFO', 'LOGSTATS_INTERVAL': 1}
2016-06-21 22:45:39 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.closespider.CloseSpider',
'scrapy.extensions.logstats.LogStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.corestats.CoreStats']
2016-06-21 22:45:46 [scrapy] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2016-06-21 22:45:46 [scrapy] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2016-06-21 22:45:46 [scrapy] INFO: Enabled item pipelines:
[]
2016-06-21 22:45:46 [scrapy] INFO: Spider opened
2016-06-21 22:45:46 [scrapy] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:48 [scrapy] INFO: Crawled 27 pages (at 1620 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:49 [scrapy] INFO: Crawled 59 pages (at 1920 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:50 [scrapy] INFO: Crawled 85 pages (at 1560 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:51 [scrapy] INFO: Crawled 123 pages (at 2280 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:52 [scrapy] INFO: Crawled 149 pages (at 1560 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:53 [scrapy] INFO: Crawled 181 pages (at 1920 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:54 [scrapy] INFO: Crawled 211 pages (at 1800 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:55 [scrapy] INFO: Crawled 237 pages (at 1560 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:56 [scrapy] INFO: Crawled 269 pages (at 1920 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:57 [scrapy] INFO: Closing spider (closespider_timeout)
2016-06-21 22:45:57 [scrapy] INFO: Crawled 307 pages (at 2280 pages/min), scraped 0 items (at 0 items/min)
2016-06-21 22:45:57 [scrapy] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 97844,
'downloader/request_count': 317,
'downloader/request_method_count/GET': 317,
'downloader/response_bytes': 469955,
'downloader/response_count': 317,
'downloader/response_status_count/200': 317,
'dupefilter/filtered': 204,
'finish_reason': 'closespider_timeout',
'finish_time': datetime.datetime(2016, 6, 21, 19, 45, 57, 835000),
'log_count/INFO': 17,
'request_depth_max': 14,
'response_received_count': 317,
'scheduler/dequeued': 317,
'scheduler/dequeued/memory': 317,
'scheduler/enqueued': 6136,
'scheduler/enqueued/memory': 6136,
'start_time': datetime.datetime(2016, 6, 21, 19, 45, 46, 986000)}
2016-06-21 22:45:57 [scrapy] INFO: Spider closed (closespider_timeout)
In the next tutorial I will try to use Scrapy.
If you have some ideas about how to do the next step, just send me a comment.
Posted by Cătălin George Feștilă.
Labels: 2.7, module, pip, pip2.7, python, scrapy, windows, winodws 10