Attention: the old PyARTK module has been obsolete for a long time. The new ARTKBlender module is now available; it works with the current version of Blender and provides functionality similar to the old module.
Blender 2.49 was recently released with a new VideoTexture module derived from my blendVideoTex plug-in. This means the plug-in is no longer needed, and I have officially stopped its development.
I tried to run the PyARTK demo on the new Blender, but it failed to detect markers. I spent several hours tracking down the source of this error and finally found it: the VideoTexture module doesn't report the correct video frame size after video replay has started. The incorrect frame size was therefore passed to PyARTK, and markers weren't detected. In this version of the demo the frame size is hardcoded in the script.
There is an updated demo for download that works on Blender 2.49.
For Blender 2.49a, which uses Python 2.6, you'll need the new PyARTK binaries.
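The workaround can be sketched like this (a minimal sketch in plain Python; the function name and the 640×480 fallback are illustrative assumptions, not the demo's actual code):

```python
def get_frame_size(reported_size, fallback=(640, 480)):
    """Return a usable (width, height) for the marker tracker.

    VideoTexture may report a wrong frame size right after replay
    starts, so fall back to a hardcoded size when the reported one
    looks invalid.
    """
    width, height = reported_size
    if width <= 0 or height <= 0:
        return fallback
    return (width, height)

# A bogus reported size falls back to the hardcoded one:
print(get_frame_size((0, 0)))  # -> (640, 480)
```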
72 thoughts on “PyARTK demo for Blender 2.49”
Ok, here we go:
Python 2.6.2 (release26-maint, Apr 19 2009, 01:58:18)
[GCC 4.3.3] on linux2
Type “help”, “copyright”, “credits” or “license” for more information.
>>> from PyARTK import arlib
>>> idx = arlib.arLoadPatt(’4x4_44.patt’)
File “”, line 1
idx = arlib.arLoadPatt(’4x4_44.patt’)
SyntaxError: invalid syntax
>>> idx = arlib.arLoadPatt("4x4_44.patt")
>>> print idx
>>> idx2 = arlib.arLoadPatt("4x4_42.patt")
>>> print idx2
It seems PyARTK is not the problem. As you can see, Python didn't like the ' symbol.
I haven't seen this yet; for me it works the same way. The Python documentation says both quotation styles are functionally the same. But if double quotes work for you, change it in the demo and let me know if you have succeeded.
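The quote style itself is indeed irrelevant in Python; what breaks the pasted session is the typographic quote (U+2018/U+2019) that blog software substitutes for the plain ASCII one. A quick check in any Python interpreter:

```python
# Single- and double-quoted string literals are identical in Python.
assert 'marker.patt' == "marker.patt"

# A typographic (curly) quote, however, is not a valid string delimiter;
# code copy-pasted from a blog fails with exactly a SyntaxError:
source = "idx = arLoadPatt(\u20184x4_44.patt\u2019)"
try:
    compile(source, "<pasted>", "exec")
except SyntaxError:
    print("curly quotes are a SyntaxError")
```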
Best regards, Ash
I still got the same TypeError 🙁 , even when using a variable to pass the argument.
By the way, I don't know if it has something to do with the issue, but when trying test_PyARTK.py I get:
*** Camera Parameter ***
SIZE = 640, 480
Distortion factor = 318.000000 278.000000 -21.225000 0.979330
812.08079 0.00000 308.00000 0.00000
0.00000 808.77698 230.00000 0.00000
0.00000 0.00000 1.00000 0.00000
[[143 155 135 ..., 169 183 186]
[176 186 196 ..., 140 145 141]
[130 136 126 ..., 151 162 156]
[127 126 96 ..., 185 203 203]
[180 194 203 ..., 65 67 64]
[ 68 73 69 ..., 102 105 112]]
[[112 105 102 ..., 69 73 68]
[ 64 67 65 ..., 203 194 180]
[203 203 185 ..., 96 126 127]
[156 162 151 ..., 126 136 130]
[141 145 140 ..., 196 186 176]
[186 183 169 ..., 135 155 143]]
It looks like some deep error is hidden behind these symptoms. I'm sorry, but I'm not able to investigate it.
Don't worry. I'll try on a 32-bit Ubuntu and also on XP, to see if it's all because of the 64 bits.
Thank you for your attention!
I managed to make it work in XP by changing all the file paths in the scripts. However, it only works when I open the .blend file directly from the system's file browser by double-clicking. If I open the file from Blender, the game engine only shows a grey texture on the plane object.
To make the camera work I had to use the ‘cam’,0,30,320,240 parameters published above in the comments.
In 32-bit Ubuntu, I compiled the module, but it also shows just a grey plane object.
The problem is the working folder: it differs depending on whether you open the .blend file by double-click or from within Blender. To avoid this difference, use the full path in the video file names (the source property).
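A sketch of resolving the video file next to the .blend file so the working directory no longer matters (a plain os.path version; in the 2.49 game engine the '//' prefix with GameLogic.expandPath serves the same purpose; the file names here are made up):

```python
import os

def full_video_path(blend_file, video_name):
    """Resolve a video file relative to the .blend file's directory,
    independent of the current working directory."""
    blend_dir = os.path.dirname(os.path.abspath(blend_file))
    return os.path.join(blend_dir, video_name)

# e.g. /home/user/demo/capture.avi no matter how Blender was started
print(full_video_path("/home/user/demo/artk_demo.blend", "capture.avi"))
```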
Hi Ashsid, I was wondering if these scripts would run on a BeagleBoard. I'm getting one next week; I can't wait to try the BGE on it.
Hi Ash, your game is fantastic! I like your idea so much; because of it, I started my college project on augmented reality. But I'm having trouble using it in Blender. I downloaded Blender 2.49 and your PyARTK binaries, installed them in site-packages, and executed your game example, but I can't see my camera's video; I only see the marker in Blender after pressing "P" in the program...
Can you help me?
Sorry for my newbie English.
Please add me on MSN: firstname.lastname@example.org
AR + Blender xD! So nice, but it's a pity I couldn't see it.
Could anyone post how to do it properly, step by step, for newbie Linux users?
I already installed openvrml from Synaptic, then I downloaded the PyARTK binaries, but I couldn't install them:
python setup.py install
/usr/lib/python2.5/distutils/extension.py:133: UserWarning: Unknown Extension options: 'swig_cpp'
building '_arlib' extension
swigging PyARTK/arlib.i to PyARTK/arlib_wrap.c
swig -python -shadow -cpperraswarn -I/usr/include/include/openvrml -I/usr/include/openvrml -I/usr/include -o PyARTK/arlib_wrap.c PyARTK/arlib.i
PyARTK/arlib.i:235: Error: Unable to find 'config.h'
error: command 'swig' failed with exit status 1
For solving compilation problems, look at the PyARTK pages: http://mgldev.scripps.edu/projects/pyartk/index.html
Your work is great. I'm doing a college project on augmented reality and I'm learning a lot from your examples. But lately I've had a problem with Blender and PyARTK while executing your demo for Blender 2.49.
My OS is: Windows 7 Ultimate 64Bits
Python script error from controller "cont1#CONTR#2":
Traceback (most recent call last):
File "detect.py", line 4, in <module>
ImportError: No module named blendVideoTex
Can you help-me?
I think you have an older version of the demo. There's no reference to the blendVideoTex module in the current version for Blender 2.49. Try downloading the current version at the top of this page.
Hi Ash, thanks for the help!
I did what you told me, and then ran the sample that has the video background (in layer 1).
But when using the example that runs the webcam, the video was very slow.
So I decided to do some tests:
1 - when I ran the webcam example without markers, it worked normally, apart from the slow webcam
2 - when I started it without markers and then brought the markers into view while the program was running, it ran without recognizing them, and after a few seconds the video froze
3 - the third test was to run it with the markers already positioned in view of the webcam; this made the video screen turn grey, with no video image
The shell showed me this error:
Python script error from controller "cont1#CONTR#2":
Traceback (most recent call last):
File "detect.py", line 82, in <module>
File "detect.py", line 27, in calcPos
File "matOps.py", line 57, in matMult
IndexError: list index out of range
Blender Game Engine Finished
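matOps.py itself isn't published here, but an IndexError inside a matrix multiply usually means one operand is empty or the shapes don't match, e.g. an empty transformation when a marker drops out mid-frame. A defensive sketch (the function name matches the traceback; the shape check is my assumption about the kind of fix, not the actual patch):

```python
def matMult(a, b):
    """Multiply two matrices given as nested lists, raising a clear
    error instead of an IndexError when the operands don't fit."""
    if not a or not b or not a[0] or len(a[0]) != len(b):
        raise ValueError("matMult: empty or incompatible operands")
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

print(matMult([[1, 0], [0, 1]], [[2, 3], [4, 5]]))  # -> [[2, 3], [4, 5]]
```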
Thanks again Ash, you're helping me a lot. May I also ask whether I can cite you in my final college project?
I have uploaded a fixed demo for the error you reported.
Thanks again Ash!
Searching on YouTube, I found an example of yours, shown below:
And I ask you: how can I download this example?
That would be of great use, because my main goal is not control through the markers, but to display video and an animation the moment the marker is recognized by the camera.
I would like to do something like this example:
Thanks for the help!!
The video with the monkey head was my first test with ARToolKit. You can do the same thing using my current demo, and it will look the same as in the second video example. It's all about the setup of the objects; no additional programming is needed.
Hello! I managed to find this in my endless searching for how to get my DirectShow capture device to work with the BGE 2.62 VideoTexture. I can't manage to get this demo file to work on my XP laptop with 2.49b and a webcam. Would you happen to know how to access a DirectShow source named "VHScrCap" through the BGE in 2.62? I was looking at the source code and it mentions nothing about DirectShow in videoffmpeg.cpp 🙁 and a couple of comment lines seem to indicate it may be missing 🙁 I'd greatly appreciate any help or suggestions; thank you for the work you have done!
I also have problems capturing video from the webcam on my notebook (Win7, Blender 2.62). It runs only once after boot; then it stops working. The problem seems to be inside ffmpeg, because the standalone ffmpeg.exe behaves the same way. ffmpeg doesn't use DirectShow but the old VfW interface.
I have gotten it to work properly now, thanks to Sergey and JesterKing in #blendercoders. If you still have problems, you should come in there and let someone know; if there is a bug in Blender's ffmpeg, perhaps they could patch it. You could also file a bug report on the 2.62 bug tracker. Thanks for replying to ancient stuff!
As for DirectShow, Blender (or Blender's ffmpeg) won't support it, but in my endless searching I did come across a page saying that DirectShow support had been added to ffmpeg at some point. Perhaps Blender is using an outdated version, isn't compiled with that option, or videoffmpeg.cpp doesn't expose it because the feature didn't exist when that code was developed. I found the post about DirectShow capture in ffmpeg here: http://ffmpeg.zeranoe.com/forum/viewtopic.php?f=3&t=27 However, checking in GraphEdit as that post mentions, I only see the webcam VfW device, not the DirectShow devices that I have feeding into it (which causes dropped frames from the live virtual source in the BGE; a direct webcam was perfectly smooth, though).
Okay, sorry for the long post; I just wanted to let you know what I've come across. If there is indeed a bug, I hope it can get fixed!