Not all pins on the Leonardo and Micro support change interrupts, so only the following can be used for SoftwareSerial RX: 8, 9, 10, 11, 14 (MISO), 15 (SCK), 16 (MOSI).
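For example, a minimal SoftwareSerial sketch that respects this limitation (the pin choices are arbitrary examples; RX must be one of the pins listed above, TX can be any digital pin):

```cpp
#include <SoftwareSerial.h>

// RX = 8 (change-interrupt capable on Leonardo/Micro), TX = 7 (no restriction).
SoftwareSerial softSerial(8, 7);

void setup() {
  Serial.begin(9600);      // USB serial for the monitor
  softSerial.begin(9600);  // software serial to the external device
}

void loop() {
  // Relay anything received on the software port to the serial monitor.
  if (softSerial.available()) {
    Serial.write(softSerial.read());
  }
}
```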
“AddForce applies the force at the center of mass.” “Simply cheat and do both an AddForce and an AddTorque; then you have simple control over your inputs without the headache of doing proper physics computation to achieve your controls.” “The other option is to use the ‘atPosition’ variant; this will in effect apply a force, but not from the center of mass, and thus imply a torque.” http://docs.unity3d.com/ScriptReference/Rigidbody.AddForceAtPosition.html
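The ‘atPosition’ variant is essentially the cheat from the second quote done for you: a force F applied at point p equals the same F applied at the center of mass plus the torque (p − com) × F. A minimal sketch of that decomposition in plain C++ (the Vec3 helper and the sample values are mine, not Unity’s):

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }

int main() {
  Vec3 com   = {0, 0, 0};  // center of mass
  Vec3 p     = {0, 0, 1};  // point of application, e.g. the nose of a ship
  Vec3 force = {1, 0, 0};  // sideways push

  // AddForceAtPosition(force, p) behaves like:
  //   1) AddForce(force)                  -> linear acceleration at the center of mass
  //   2) AddTorque(cross(p - com, force)) -> rotation from the lever arm
  Vec3 torque = cross(sub(p, com), force);

  printf("implied torque = (%g, %g, %g)\n", torque.x, torque.y, torque.z);
  return 0;
}
```

Pushing the nose sideways yields a torque around the up axis, which is exactly the yaw you would expect.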
Sherlock Holmes & The Dimensional Pipe is an AR experience that allows the participant to revisit mysterious crime cases solved by detective Holmes and Dr. Watson around 1900. This entanglement of time and space between 1900 and 2022 was made possible by combining an alternative controller called “the dimensional pipe” with the AR capability of a smartphone.
a dimensional pipe prototype
About the demo case: The Adventure of The Unholy Man
This is the first case, and probably the most famous one, across all editions of the 221B Baker Street board game, including the 1987 video game adaptation.
It is interesting that the video game version and the board game version have different numbers of clues for this case:
– Longworth does not smoke.
– Anastasia did not like the strange preacher.
URL format: https://styly.cc/ja/mobile/?location={location GUID}&size={printed size}
Example: https://styly.cc/ja/mobile/?location=d83f83ae-af5a-4480-bd3a-9fa6e40cf5fe&size=0.05 (5cm × 5cm)
Traveling between parallel universes has become more frequent in recent years. Based on our research, the excess, abnormal energy left behind by a jump between two universes attracts extradimensional creatures called YAMI, which are usually found in the void between dimensions. They were first discovered by our agent in Japan, hence the name. YAMI are generally not harmful to humans. However, the various energies they digest, including ones from other universes, might cause a temporary imbalance that could lead to disasters. Your mission is to survey the area for YAMI and send them back to the void with our handheld device.
The alternative controller: This is inspired by one of my favorite handheld electronic games, Treasure Gausts (トレジャーガウスト), and I thought it would work nicely as an AR experience on smartphones. I built a quick demo that allows the player to follow and capture a YAMI.
Now I want more game mechanics than just tapping on the phone screen.
After a few quick sketches, I went on Thingiverse to look for a smartphone mount. I started out by modifying jakejake’s Universal Phone Tripod Mount (https://www.thingiverse.com/thing:2423960). The design of this mount is brilliant, and it holds up pretty well. I then built out the rest of the device piece by piece. I wanted some kind of switch at the bottom of the device that lets the player “send YAMIs back to the void”, an action the player performs to initiate the send-back. This reminded me of the Tenketsu (天穴, Heavenly Hole) in the anime Kekkaishi (結界師).
Tenketsu!!
I created a ring-like contraption at the bottom of the grip. When a YAMI is weakened, the player pulls down the ring to initiate the interdimensional suction. For the rest of the inputs, I had originally wanted to use a Dual Button unit, but I found out it shares the same pin (GPIO36) with the OP.90 unit on the M5Stack FIRE.
Tenketsu On!! Made possible with the OP.90 unit from M5Stack and a 3D-printed piece that functions like an on/off switch.
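A minimal test sketch for the pull ring, assuming the OP.90 unit sits on Port B of the M5Stack FIRE and reports on GPIO36 (the pin the Dual Button conflict above points at); the active level is a guess from my wiring, so flip it if yours reads inverted:

```cpp
#include <M5Stack.h>

// Assumption: OP.90 unit on Port B of the M5Stack FIRE, signal on GPIO36.
// GPIO36 is input-only on the ESP32, which is fine for reading a sensor.
const int OP90_PIN = 36;

void setup() {
  M5.begin();
  pinMode(OP90_PIN, INPUT);
  M5.Lcd.println("Waiting for Tenketsu...");
}

void loop() {
  M5.update();
  // Assumption: pulling the ring blocks the sensor and drives the line LOW.
  if (digitalRead(OP90_PIN) == LOW) {
    M5.Lcd.println("Tenketsu ON: send the YAMI back to the void!");
    delay(500);  // crude debounce so one pull prints once
  }
}
```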
The other game mechanic that I wanted to add to the controller is spell casting. I want magic rings! I quickly prototyped some wearable rings with RFID tags embedded. The player has to choose which ring to use during the capture.
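A sketch of the ring lookup, assuming an MFRC522-style reader and the common Arduino MFRC522 library (my actual reader and wiring may differ); each ring’s tag UID maps to a spell, and the UID below is a hypothetical placeholder:

```cpp
#include <SPI.h>
#include <MFRC522.h>

// Assumption: MFRC522 reader on the hardware SPI bus; adjust pins to your wiring.
MFRC522 reader(/*SS*/ 10, /*RST*/ 9);

void setup() {
  Serial.begin(9600);
  SPI.begin();
  reader.PCD_Init();
}

void loop() {
  if (!reader.PICC_IsNewCardPresent() || !reader.PICC_ReadCardSerial()) return;

  // Hypothetical UID: replace with the UID printed by your own ring tag.
  const byte FIRE_RING[4] = {0xDE, 0xAD, 0xBE, 0xEF};

  bool isFireRing = reader.uid.size == 4 &&
                    memcmp(reader.uid.uidByte, FIRE_RING, 4) == 0;
  Serial.println(isFireRing ? "Fire ring equipped" : "Unknown ring");

  reader.PICC_HaltA();  // stop talking to this tag until it is presented again
}
```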
GGG – I love this design.
The Executioner and Her Way of Life (処刑少女の生きる道) – saw a similar but more complex one recently in this show.
Development notes:
Left or Right of the forward vector:
This is one of those topics that sounds pretty simple at first, but it takes some vector math to figure out. The original solution was found here, written in C#.
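The original C# snippet isn’t reproduced here, but the trick is short: cross the forward vector with the direction to the target, then dot the result with the up vector; the sign of that dot product tells you the side. The same math as a minimal plain C++ sketch (the Vec3 helper is mine, not Unity’s Vector3):

```cpp
struct Vec3 { float x, y, z; };

Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns +1 for one side of forward, -1 for the other, 0 for dead ahead or behind.
// Which sign means "right" depends on the handedness of your coordinate system;
// in Unity's left-handed system, positive means the target is to the right.
int sideOfForward(Vec3 forward, Vec3 up, Vec3 toTarget) {
  float d = dot(cross(forward, toTarget), up);
  return (d > 0.0f) - (d < 0.0f);
}
```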
The big idea is to modify this toy lamp into an alternative controller. There are four hexagon-shaped LED covers on each side of the lamp. After a quick autopsy, I found that these covers can easily be turned into touch buttons, which are perfect for simulating the back-and-forth lamp-rubbing action. I will be using an M5Stack + MPR121 (Touch Sensor Grove Platform Evaluation Expansion Board) + our HID Input Framework for xR to prototype this experience.
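A first smoke test of the hex covers as touch pads, assuming the MPR121 sits at the default I2C address 0x5A and using Adafruit’s MPR121 Arduino library (the Grove board may use a different address or library):

```cpp
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 touch;

void setup() {
  Serial.begin(9600);
  // 0x5A is the MPR121 default I2C address; Grove boards may differ.
  if (!touch.begin(0x5A)) {
    Serial.println("MPR121 not found, check wiring");
    while (true) {}
  }
}

void loop() {
  // Each bit of touched() is one electrode; the four hex covers of one side
  // are assumed to be wired to electrodes 0-3.
  uint16_t state = touch.touched();
  for (uint8_t pad = 0; pad < 4; pad++) {
    if (state & (1 << pad)) {
      Serial.print("Pad ");
      Serial.print(pad);
      Serial.println(" touched");
    }
  }
  delay(50);
}
```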
In order for the lamp to be tracked in VR, I have to find a way to mount the Touch controller on it as well. After some rapid prototyping, I decided to mount the Touch controller on top and the M5Stack on the bottom of the lamp. I imagine the HTC VIVE tracker would also be a great option given its compact form factor, but I am trying to keep the controller wireless.
I am working on the gameplay for the directional rubbing mechanic, which allows the player to blow out (rub outward), suck in game objects (rub inward), or cast/summon (rub back and forth).
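One way to classify the rub, sketched below and untested: number the pads 0-3 from the base of the lamp to the tip, feed classify() each newly touched pad index, and read rising indices as outward, falling as inward, and repeated reversals as back-and-forth. The state machine and names are mine:

```cpp
#include <cstdio>

// Classifies a rub gesture from the sequence of touched pad indices.
// A real version would also reset the state after a quiet period.
enum Rub { RUB_NONE, RUB_OUTWARD, RUB_INWARD, RUB_BACK_AND_FORTH };

int lastPad   = -1;
int trend     = 0;  // +1 while touches move outward, -1 while inward
int reversals = 0;  // direction changes seen within the current gesture

Rub classify(int pad) {
  Rub result = RUB_NONE;
  if (lastPad >= 0 && pad != lastPad) {
    int dir = (pad > lastPad) ? +1 : -1;
    if (trend != 0 && dir != trend) reversals++;
    trend = dir;
    if (reversals >= 2)                             result = RUB_BACK_AND_FORTH;
    else if (reversals == 0 && dir > 0 && pad == 3) result = RUB_OUTWARD;  // reached the tip
    else if (reversals == 0 && dir < 0 && pad == 0) result = RUB_INWARD;   // reached the base
  }
  lastPad = pad;
  return result;
}

int main() {
  // Simulated outward rub across all four pads: should "blow out" once.
  const int touches[] = {0, 1, 2, 3};
  for (int t : touches) {
    if (classify(t) == RUB_OUTWARD) printf("blow out!\n");
  }
  return 0;
}
```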
The way we play digital games is secretly influenced by the advancement of technology. While new technology inspires new play mechanics, obsolete technology also takes away play mechanics we took for granted. One of the better-known examples happened in the early 2000s, when TV technology transitioned from CRT (Cathode Ray Tube) to LCD. This advancement killed off the light-gun genre in its entirety, because traditional light-gun technology requires a CRT to position the light-gun pointer on the TV screen. This tragic loss on mainstream consoles wasn’t resolved until 2006, when the Nintendo Wiimote came out.
The subject of this post is another example: the barcode battler. When it comes to scanning linear barcodes, the card-swiping action is the coolest! These days, barcode interactions are done with either a built-in camera or a handheld barcode scanner. The card-swiping action is gone!!
QRE1113
The QRE1113 IR Reflective Photo Interrupter features an easy-to-use analog output, which varies depending on the amount of IR light reflected back to the sensor. The QRE1113 is comprised of two parts: an IR-emitting LED and an IR-sensitive phototransistor. When you apply power to the VCC and GND pins, the IR LED inside the sensor illuminates. Because dark colors bounce back less light, the sensor can be used to tell the difference between white and black areas, which is why it is often used in robots as a line follower.
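Reading the analog breakout is a one-liner; the threshold below is a placeholder to tune against your own cards, and whether black reads high or low depends on the breakout’s circuit:

```cpp
// QRE1113 analog breakout on A0: print the raw value and a black/white guess.
const int SENSOR_PIN = A0;
const int THRESHOLD  = 500;  // placeholder; tune by watching the raw values

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SENSOR_PIN);  // 0-1023 on a 10-bit ADC
  bool darkBar = (raw > THRESHOLD);  // assumption: dark surfaces read higher
  Serial.print(raw);
  Serial.println(darkBar ? "  BLACK" : "  WHITE");
  delay(20);
}
```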
I found out recently that both the US and Japanese versions of the Goseiger (天装戦隊ゴセイジャー) henshin toy, the Tensouder (テンソウダー), use two QRE1113 sensors to read the double-decked barcode on the side.
Select the Unity Assets > Import Package > Custom Package menu command.
Locate the downloaded asset package and click Open.
The assets are imported into your project.
Every development and client computer must also install the Leap Motion service software (which runs automatically after it is installed).
Using Processing
You can use the Leap Motion Java libraries in a Processing Sketch (in Java mode). This involves adding the Leap Motion files to the Processing libraries folder and importing the Leap Motion classes into the Sketch.
Setting Up the Leap Motion Libraries
To put the Leap Motion Java libraries in the Processing libraries folder, do the following:
Locate and open your Sketchbook folder. (The path to this folder is listed in the Processing Preferences dialog.)
Find the folder named libraries in the Sketchbook folder; create it if it doesn't exist.
Inside libraries, create a folder named LeapJava.
Inside LeapJava, create a folder named library.
Find your LeapSDK folder (wherever you copied it after downloading).
Copy the three library files from LeapSDK/lib to LeapJava/library (LeapJava.jar plus the two native Leap libraries for your platform).
If you prefer, you can get the Ultraleap Hand Tracking Plugin for Unity using .unitypackage files. This can be helpful if you need to modify the package content. Please note that for future releases, .unitypackage files will need to be updated manually.
Right-click in the Assets window, go to Import Package and left-click Custom Package.
Find the Tracking.unitypackage and import it. This includes Core, Interaction Engine, and the Hands Module.
Optionally import:
the Tracking Examples.unitypackage for example content
the Tracking Preview.unitypackage and Preview Examples.unitypackage for experimental content and examples. This content can go through many changes before it is ready to be fully supported. At some point in the future, preview content might be promoted to the stable package; however, it might also be deprecated instead. Because there is no guarantee of future support, we do not recommend using preview packages in production.
Girl Gun Lady is a Japanese live-action sci-fi TV drama. I am particularly fascinated by the digital weapons designed for this drama. All of the weapons, including the Gun Ladies themselves, are available as plastic model kits. I am having a blast building some of them. My favorite weapon is the Alpha Tango. It is the size of a pistol but functions like a grenade launcher, and the grenade ammo can be programmed to do different things.
Girl Gun Lady Ver. Alpha Tango
Alpha Tango also reminds me of Maam’s Magic Bullet Gun (魔弾銃, まだんがん) from Dragon Quest: The Adventure of Dai, which is another favorite sci-fi weapon of mine from childhood.
Picking the right bullet for the situation is an interesting game mechanic to explore. Judge Dredd’s Lawgiver is another fun(?) example. I did a voice-activated light gun project in early 2007, which was inspired by the Lawgiver in Sylvester Stallone’s Judge Dredd (1995).
This should be my next data relic. Meanwhile, I did a quick study of Maam’s Magic Bullet Gun in Tinkercad.
I also modified the Oculus Quest 2 Controller Pistol Grip (https://www.thingiverse.com/thing:4760656) to work with the M5Stack. This could be great for a voice-activated weapon using Google Assistant. The grip file I downloaded directly from Thingiverse didn’t fit; I couldn’t push the grip all the way up as shown in the pictures. So I used Tinkercad to enlarge the hole with a model of the Quest 2 controller scaled up by 1%. After that adjustment, it fits smoothly.
Going back to the voice inputs. My experiments with both Watson and Google Assistant show that there is a significant delay in speech-to-text responses, and it gets worse with a slow internet connection. I have had a hard time demonstrating projects that use cloud speech-to-text services at demo day events and conferences in the past. What can be done in the UX to make that passage of time feel shorter, less significant? Slow motion? Well, there is only one way to find out.
I had done some experiments with their SSML, expressive SSML, and Voice Transformation SSML in 2017. It was an interesting way to change, almost to code, the voice in order to make it more human-like. I went back to the IBM Watson Text to Speech demo today (https://www.ibm.com/demos/live/tts-demo/self-service/home) and found out it works differently now.
“Hurry up! Pick up the sword and defend yourself; your arrival has awakened the spores. It’s not a good thing.”
IBM Watson – Lisa
Natural Reader – Free
TTSReader
wideo – English (US) – Mike Stevens
Google Text-to-Speech – English (United States) / WaveNet / en-US-Wavenet-D / Headphones or earbuds
typecast – Vanessa / Normal-A
typecast – Keybo / fast
“Greetings, my name is Luke. The voice that you’re hearing right now is not my real voice. It is the result of my telepathic thoughts being synthesized by your auditory cortex. It could take the form of any voice you have heard before.”