3Dconnexion Space Mouse Support in Godot

cybereality
Cool
Another potential use for controlling the editor view is adding camera-view shortcuts: place the camera looking at part of your scene, then bind that transform to a key. There are already the six axis-view shortcuts, but no free-view recording that I can see.

For some reason I can't get release builds of 4.0 to work (they come up with an error about a missing .pck), but debug builds work fine (just slower). Probably some setting got messed up when I followed some online instructions to get Godot compiling from Visual Studio (though still routed through SCons, annoyingly).

It's working!!! @Kojack @Megalomaniak

I gave up the idea of editing the source code. Though I did get it sort of working, that Cursor class is used for too much and it was causing all sorts of crazy bugs. Even if I did finish it, there was the problem of the z axis, so there was always going to be some jump. I decided to just use what was available as a plugin. This means the 6DOF state is not saved, but it doesn't mess up the normal function of the mouse and keyboard. I'll have to add cross-platform support for the extension, but the plugin itself is fully working.

    cybereality It might be handy to have a 5DOF mode. I find roll can get annoying at times in editors. 🙂
    Cool work though!

    I actually find roll helpful since it gives you a different perspective. This is important in 3D because you are literally missing a dimension. It's sort of like how traditional artists back in the day were told to view their drawings in a mirror, because it shows you things you can't see normally.

    Still, a turntable mode might have its uses too. And a fly mode. But for initial support these can probably be considered "fluff".

    Then again, if you are using the 3Dconnexion SDK, the control panel can already disable roll (at least on Windows).

      I guess it is sort of a turntable. It rotates around the selected object right now. You can kind of still fly wherever, since you have full control, but it gets difficult if you move more than around 6 units from the center (of the current object). Fly mode I can look into. I have tried it before on other apps and it never worked well. But it's worth investigating.

      Kojack Then again, if you are using the 3Dconnexion SDK, the control panel can already disable roll (at least on Windows).

      Yes, I can put in some options. That shouldn't be an issue. I have to do that anyway to control the speed.

        cybereality I guess Linux may not have the same thing. But on Windows there's already a configuration panel in the drivers for apps that use the SDK, which gives axis scaling, turning axes on/off, etc.

        No, Linux never had that. It was just a service with no GUI, and it hasn't worked in a long time. I looked into the official SDK, but it looks overly complicated and I'm not especially fond of the license (particularly because I plan to release the plug-in open source). While I don't have to open source the DLL, it still makes me nervous. I think I did enough for today, but I'll see about either HIDAPI or porting libspnav to Windows/macOS. Not sure which would be easier, but I feel like contributing to libspnav may be more worthwhile in the long run.

          cybereality Wait, sorry, I'd already misremembered the earlier part of the thread and thought you were using the Linux SDK. Ignore my last post.
          🙂

          I like HID because I can also use it to get access to all features of things like PS5 controllers (touch pad, IMU, etc).

          Yeah, the official SDK is rather excessive. For my own simple Connexion library (not my full multi-device input library) that I use with things like Unity, the entire interface is just:

          struct ConnexionState
          {
          	float pos_x;
          	float pos_y;
          	float pos_z;
          	float rot_x;
          	float rot_y;
          	float rot_z;
          	unsigned int buttons;
          };
          int init();
          void poll();
          unsigned int getDeviceCount();
          unsigned int getDeviceType(unsigned int deviceIndex);
          ConnexionState getState(unsigned int deviceIndex);
          ConnexionState getStatePrevious(unsigned int deviceIndex);
          unsigned int buttonPressed(unsigned int deviceIndex, unsigned int button);
          unsigned int buttonReleased(unsigned int deviceIndex, unsigned int button);
          unsigned int buttonDown(unsigned int deviceIndex, unsigned int button);
          void setLED(unsigned int deviceIndex, unsigned int value);
          void setRotationPower(unsigned int deviceIndex, float p);
          void setTranslationPower(unsigned int deviceIndex, float p);

          This is where I still applaud Microsoft for the XInput API. Unlike almost every other API they've made, XInput is so incredibly simple to use. No setup/init. One function call gives you the state of every part of an Xbox controller in a simple struct. One function call lets you set haptics. Shame it doesn't support other brand devices like DirectInput does.

          Side note, I don't understand these SDKs that are so over-engineered. I just want a single header file, like 5 functions, and a simple command-line sample. Not sure why they have to make things so obtuse.

            cybereality Back a long time ago I got a TrackIR (head tracker camera for flight sims).
            I wanted to add support to my college game engine.
            I had to apply to be a developer (they hand picked them). They wanted me to sign an NDA, because the C++ header of their API contained proprietary secrets or some crap. WTF? All I needed was an init function and something to return 6 floats. There's nothing else the device needs to be used.
            They also said not only could I not allow any student to see their header, but also no co-worker at the college could see it. I'd have to make my own binary only API that wrapped their API and keep the source secret.
            I never responded after I read that.

            Later I found out that they were so paranoid about their software that they made the camera (a USB IR webcam) only turn on if you sent a copyrighted haiku to it, so any third party drivers could be sued for copyright infringement. They also extorted companies like Eagle Dynamics (DCS flight sims) to remove support for competing alternative head tracking devices under threat of their games being blacklisted by the TrackIR driver.

            Urgh.

            Still not my strangest contract. I bought a Beagleboard (the spiritual ancestor of the Raspberry Pi that started the small ARM SoC board craze). But due to some encryption code in the firmware, the US treated it as a military device for export, and to get it shipped to Australia I had to sign a document from the Department of Energy saying I had no dealings with a list of known terrorist collaborator companies. Also I saw the term "weapons of mass destruction" in the doc at least once. It was a tiny ARM board equivalent to maybe an iPhone 3.
            I've still got the Beagleboard on my shelf, never really used it for anything (it was going to be a dev kit for the Pandora handheld, but then my Pandora shipped).

            Yeah. Then these companies wonder why they go out of business.

            I got HIDAPI working. It only took a few hours, way easier than I thought. Everything is hard-coded for my device though, so it will take some time to configure for all the various models. I found this code, which has the vendor and product IDs, which is a great help.

            https://github.com/johnhw/pyspacenavigator/blob/master/spacenavigator.py

            It also has some of the different mapping formats of the data. However, my device seemed to work differently (even though it is listed there with the correct VID/PID). The axes were not in the right order (meaning not x, y, z) and some were negated. I did get it all working, but I expect that not all devices conform to this format, as you can see in the link above. I have one older wired Space Navigator, so I can try that later. That should give me an idea, but there are like 8 or so devices I would need to support.

            Also, HID has WAY better performance than libspnav. There is no more choppiness and everything is butter smooth at 144Hz. So that is a nice bonus. And the code should compile on Windows and macOS, but I will have to verify that tomorrow. So this was definitely the right plan.

              cybereality
              Yep, I believe the protocol changed when they went from Space to SpaceMouse. The event 3 (buttons and long press buttons) changed to event 28 (buttons) and 29 (long press buttons).

              Hehe, I clicked that link, firefox showed me I already had it bookmarked. 🙂

              Well, I'm not planning on supporting buttons, just axis control, so that simplifies things. If there are only two protocols for the axes, that should be simple. I just don't want to deal with trying to support a bunch of devices I don't own. But I think there are enough open-source examples for me to attempt it with what I have.

              Also, I noticed that the movement is not exactly the same. All I did was make a new shared lib; I left the interface the same, so I just dropped it into my plugin and everything basically worked. However, it feels very different. I think libspnav was doing some sort of low-pass filter or clamping or something, because it was choppier before, but not noisy. Now it is super smooth, but things seem to move in unintended directions. So I'll have to look into that next, maybe tomorrow. But I have the basic code working, it's just a matter of tweaking some numbers.

              That's why my input library has a rather odd collection of devices it supports, so far only things I directly own. 🙂

              I do find the values are jittery when looking at them directly, but I've never noticed it when using them to control things.

              A couple of other things to consider if you haven't already done so:
              The range of the axes can vary, not just by model but by individual device. The general range is considered to be -350 to 350, but for example my SpacePilot Pro vertical axis has a range of around -410 (up) to 532 (down), and I have to push very hard on it to hit those extremes. The X axis is similar, left has a smaller range than right.
              What I do is start with -350 to 350 as the min/max for each axis. If I get a larger value, I adjust the range to fit it. The returned values from my lib are then scaled to fit -1.0 to 1.0 to the current range.

              Also making the axes nonlinear can be good. Put a power curve on it so small movements from centre have less effect. The official app only lets you change scale, mine has user specified power as well.

              Here's my input test app where I reverse engineer these things. 🙂

              Okay, cool. I think I see the issue. I have a command-line app just printing values, and I see some odd ones when barely moving the puck. Like it will be at 0 and then jump to 94 or some high number when I am only slightly touching it. So maybe my code is not correct, or there are some conversion issues. Because the numbers look fairly plausible most of the time, but seem biased in one direction and also extremely noisy close to zero. It's getting late, though, I will investigate tomorrow. Thanks for the advice.

              Pretty sure the Windows driver & control software will run a calibration step on first launch, and it's not optional.