Let's Make Robots!

Reading USB mouse data in Linux - somebody else's C app and my Python version

Here is a slightly modified (added in hs and vs) C utility, and my Python version. They will read the Linux /dev/input/mice data and convert it into button status and x/y deltas.

I noticed my Python app behaves quite differently from the C app. Not quite sure why - the default int types in C must be signed, I don't know... My second two bytes, for x and y, have the MSB set if they are negative, but from what my Python tells me the sign also shows up in bits 4 and 5 of the first byte (the X/Y sign bits).
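For what it's worth, the sign handling can be done explicitly in Python - a minimal sketch, assuming the standard 3-byte PS/2 packet layout (buttons in bits 0-2 of the first byte, X/Y sign bits in bits 4 and 5, deltas in the second and third bytes):

```python
def decode_packet(data):
    """Decode one 3-byte PS/2-style mouse packet into (buttons, dx, dy)."""
    status, raw_x, raw_y = data[0], data[1], data[2]
    buttons = status & 0x07  # bits 0-2: left, right, middle
    # Bits 4 and 5 of the status byte are the X and Y sign bits; when set,
    # the data byte is the low 8 bits of a negative two's-complement value.
    dx = raw_x - 256 if status & 0x10 else raw_x
    dy = raw_y - 256 if status & 0x20 else raw_y
    return buttons, dx, dy
```

In a read loop you would grab three bytes at a time from /dev/input/mice and feed them through, something like `buttons, dx, dy = decode_packet(f.read(3))`.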

 

comments welcome!

Attachment          Size
cmouse.txt          846 bytes
pythonmouse.txt     1.08 KB


I have tested several different mice, with diodes as well as one with a laser. I fail to get accurate readings when moving faster than 0.5 m/s. Is it even possible, and can someone give me a hint on this? Would it be possible if I rewrote the drivers or used PS/2 instead of USB? Or is it possible with a mechanical mouse?

 

Currently I plan to connect two USB mice to an Asus mini PC on the robot, and I use Linux as the OS.

What are you using to take the readings?

 

There is an app called mev that comes with the Linux gpm package that I have been looking at - my Python script doesn't quite cut it.

I am reading /dev/input/mice almost like you.


You can see my code from there:

http://digi.physic.ut.ee/mw/index.php/Cmouse2.c

You lost me at pickled packets, my friend :D

So do you intend to use the Eee PC as the processor? Because I think it would be a whole lot easier hooking up a PS/2 mouse to a general sensor-reading PICAXE system first, and then have it interface with a higher-up motherboard, which will interface via one USB port of the Eee PC. Or two, maybe, if you want faster motor response time.

PS/2 because the connector has a more basic hookup - just clock and data lines. Mice have something called overflow, which happens when the x/y movement is larger than one packet's data byte can represent. So the x/y bytes give the measured movement, while the overflow bits flag when the mouse moved too fast to count. Hope it helps!
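If it helps, in the standard PS/2 packet the overflow flags live in the top two bits of the status byte - a small sketch, assuming bit 6 is X overflow and bit 7 is Y overflow:

```python
def has_overflow(status):
    """Return (x_overflow, y_overflow) from a PS/2 mouse status byte.
    These bits mean the delta exceeded the range one packet can hold,
    so the x/y values for that packet can't be trusted."""
    return bool(status & 0x40), bool(status & 0x80)
```

Checking these before accumulating deltas would tell you when a fast move has clipped the reading.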

 

The Eee PC is working nicely as the brains - I have made some slight changes to the Python mouse driver and now have it functioning in a test feedback loop between its input and the motor driver's output. There's not really anything for it to process in regards to the mouse; it's just a matter of reading 3 bytes from the device file whenever I need the status - figuring out the order of the bytes was the hard part.

 

Currently all of my devices are USB interfaces - serial for the motor and servo controllers and the Arduino, plus the USB mouse and camera devices. It's cheap and easy to expand. The Arduino is for interfacing with anything lower level - currently it's just reading values from the sonar to serial, which is read by another Python app. I also have Python daemons that talk via pickled packets over UDP to pass data between the different serial devices over a 'net connection.
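For anyone wondering about the pickled-UDP part, the idea is just Python objects serialized with pickle and sent as datagrams - a minimal sketch (the function names are illustrative, not the actual daemon code, and pickle should only be used between trusted machines):

```python
import pickle
import socket

def send_packet(sock, addr, obj):
    # Serialize the object and fire it over UDP in one datagram.
    # Assumes the pickled form fits within the datagram size limit.
    sock.sendto(pickle.dumps(obj), addr)

def recv_packet(sock, bufsize=4096):
    # Receive one datagram and unpickle it back into a Python object.
    data, addr = sock.recvfrom(bufsize)
    return pickle.loads(data), addr
```

A daemon on each end can then pass dicts of sensor readings or motor commands around without inventing a wire format.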

 

 

Getting a GUI for Python like pygame or PyQt would help. Simply spread out a canvas or something and have it read the mouse position.

I have been very tempted, while working on this, just to kludge it together with pygame, but my bot's design does not make that feasible.

The mouse device I want to read from is bolted to the underside of a chopping board, plugged into an Eee PC fastened to the top. I don't want to worry about the overhead of running a GUI, etc. on a headless bot, plus this will make it easier to interface with other systems.

I actually use both pygame and wxWidgets interfaces for "human control" of my bot, complete with feedback displayed from a servo-mounted sonar on the front. wxWidgets sliders control the arm, and the pygame x,y positions are converted into commands for the left and right motors.
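The x,y-to-motor conversion can be as simple as arcade-style differential mixing - a sketch of the idea, not the actual code used here (assumes x and y are normalized to -1..1, with y forward and x turn):

```python
def xy_to_motors(x, y):
    """Convert a normalized (x, y) position into (left, right) motor
    commands using arcade-style mixing, clamped to the -1..1 range."""
    left = max(-1.0, min(1.0, y + x))
    right = max(-1.0, min(1.0, y - x))
    return left, right
```

Pushing straight forward drives both motors equally, while pushing sideways spins the wheels in opposite directions.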

The mouse input will be used to get an idea of how well the robot's movement reflects the intentions of the commands sent to it - input for traction control, essentially. If it works well I may even add a mouse to each wheel, so I can detect relative slippage, etc.
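The traction-control comparison could start as simply as this - a hedged sketch comparing commanded distance against what the mouse actually measured (names and units are illustrative):

```python
def traction_error(commanded_dist, measured_dist):
    """Fraction of the commanded movement that was lost to slip:
    0.0 means the mouse saw exactly what was commanded,
    1.0 means the wheels spun with no measured movement at all."""
    if commanded_dist == 0:
        return 0.0
    return (commanded_dist - measured_dist) / commanded_dist
```

With a mouse per wheel, comparing the two measured distances against each other would flag relative slippage the same way.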