I had done a few projects in the past using an IR remote as a means of wireless communication. I used the FLIRC USB (https://flirc.tv/more/flirc-usb), essentially a USB IR receiver that plugs into your computer and turns IR signals into specific keystrokes. The great thing about FLIRC is that once you have configured it with its application on one computer, you can use it anywhere with a USB port. The only downside is that the FLIRC USB offers just six customizable inputs (up, down, left, right, enter, and back) because it is meant for media playback. I have now figured out a way to do the same thing with the M5Stack and the IR Unit.
I am using the M5Stack Core + IR Unit (Port B), so I have to change the receive pin to 36.
Upload it to the M5Stack and get the readout (Code) from the serial monitor. Since I don’t need to recreate the IR signal, the Code here will work just fine.
Now, M5Stack can read the remote and react!
In order to work with result.value, I need a uint64_t-to-String helper method to make the comparison easier.
String u64ToS(uint64_t input) {
  String result = "";
  uint8_t base = 10;
  do {
    char c = input % base;
    input /= base;
    if (c < 10)
      c += '0';
    else
      c += 'A' - 10;  // only reached if base is changed to something > 10
    result = c + result;
  } while (input);
  return result;
}
or
String print64(uint64_t n) {
  char buf[21];  // 20 digits max for a uint64_t, plus the terminator
  char *str = &buf[sizeof(buf) - 1];
  String sdata = "";
  *str = '\0';
  do {
    uint64_t m = n;
    n /= 10;
    *--str = m - 10 * n + '0';  // extract the last digit of m
  } while (n);
  sdata += str;
  return sdata;
}
Two weeks later – I finally got my New User status. It was a little frustrating at the beginning. Based on what I read online, I needed 10 hours of gameplay to reach New User status, which allows me to upload avatars and worlds to the lab community. I did that, but nothing happened. A kind player in VRChat told me that I needed to make friends too. Oh, why didn’t I think of that? After I made one friend, the next time I logged in I was promoted to New User. And I got this in the e-mail.
I brought the Gucci model into Maya, separated the shoes from the model they came with, and put them on my VRChat avatar.
Install Unity 2018.4.20f1, the version recommended by VRChat.
Upload my avatar by following this tutorial – in 5 minutes: https://vrchat.fandom.com/wiki/Quick_Start_-_Mixamo_Avatar_Creation I believe this was written for SDK2, but it works for SDK3, especially when selecting the Animator Controller. I used the T-pose one from SDK3 and it works fine. I was able to upload; however, my sitting and ducking animations were messed up…
To make it work for both Quest and PC, I have to upload the avatar twice: once in Windows mode and once in Android mode. The official tutorial online recommends duplicating the project.
In just a few hours, I am going to participate in the graduation ceremony for our Class of 2021 cohort. I worked extensively with a good portion of them in their first year of school on virtual reality and playful experience design. I wanted to make something to remember them by. It is purely impulsive and based on no research or data. It is very last minute, but I decided to celebrate this moment with a new exit act.
After we moved learning online during the pandemic last year, I have been experimenting with different online identities. From animated characters with face tracking to cosmetic filters, it was an unexpected adventure into a world of digital tools that I hadn’t cared much for. During those experiments, I created a thing called exit acts – exiting Zoom meetings with style. I created one for taking off in a Gundam RX-78 and another for flying away with Spike’s Swordfish II from Cowboy Bebop. Jamming out with an EVA-03 from Neon Genesis Evangelion seems like a reasonable next project.
My teaching assistant, Sharon, told me that the overall visual scheme of their thesis website was inspired by the color scheme of the anime Evangelion. Time to collect some assets and put together a theme. I want to use a cityscape silhouette to create high contrast between foreground and background, similar to the technique used in some of the Fortnite banners.
The USB HID (Human Interface Device) specification allows us to build plug-and-play alternative controllers for a variety of devices without installing additional drivers. In this test, the M5Stack GO is disguised as a normal HID keyboard. Instead of serial communication, the M5Stack GO processes the sensor data locally and sends a corresponding keystroke defined by the designer. Keystrokes are like digital switches, so an HID keyboard can’t send analog data.
Tested and working with STYLY in Unity, on the web, in AR (mobile), on Oculus Quest (standalone), and on PCVR (Oculus Link or Air Link).
PCVR and WEB TEST SCENE Connect the controller to your computer, then go to the following page and add the scene to your list. PCVR users will find it under My List in the STYLY app; web users can just open it in the browser. https://gallery.styly.cc/scene/871fdff9-475c-4224-af41-9e8afba32922
Quest (standalone) Connect the controller to your headset, go to the link above, and add the scene to your list. You can find it under My List in your STYLY app.
AR TEST SCENE Connect the controller to your mobile phone and scan the QR code with the STYLY app.
Advanced topic: would it be possible to read bleKeyboard.print() all at once as a string? If so, I could send analog data over as a string without having to scale it down to letters.
The biggest difference with an HID gamepad is the ability to send joystick information. A joystick is made of two axes (usually 10K potentiometers), so with the left and right joysticks and the two shoulder triggers, an HID gamepad can communicate six analog values as axes by default. It’s worth mentioning that an HID mouse can also communicate one analog value over its scroll wheel.
Combine the IndividualAxes and PotAsAxis examples in ESP32-BLE-Gamepad with the JOYSTICK example in M5Stack/Units.
After some initial testing, the old input system in the Unity editor can’t recognize the ESP32 BLE gamepad. When connected through Bluetooth, it doesn’t show up as a gamepad the way TEST 3 did. The new input system recognizes it, but the action needed to access the inputs is a custom one that is not included in Playmaker’s standard actions.
This test is based on a custom library put together by Shigeru Kobayashi. This library allows the ESP32 gamepad to be discovered by both the old and new input systems in Unity 3D as the default axes X and Y and buttons 0 to 4.
Set the network parameters and your account information in network_param.h.
Speak into the microphone and watch the serial monitor.
Test this in the command prompt (Windows 10): curl -s -X POST -H "Content-Type: application/json" -d @request.json "https://speech.googleapis.com/v1/speech:recognize?key=API_KEY"
Make sure the request.json file is in the same directory in which the curl command is executed.
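For context, a request.json for this endpoint follows the shape below. The sampleRateHertz value is a typical one and an assumption on my part; the languageCode matches the en-US setting used in this post, and the base64-encoded audio content is elided:

```json
{
  "config": {
    "encoding": "LINEAR16",
    "sampleRateHertz": 16000,
    "languageCode": "en-US"
  },
  "audio": {
    "content": "...base64-encoded audio..."
  }
}
```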
To get this sketch to run on the M5Stack, I had to set up OpenSSL on my Windows 10 machine. Here is a recent tutorial: https://tecadmin.net/install-openssl-on-windows/ One confusing thing in this tutorial is the directory name. I downloaded the Win64 version as instructed, so my directory should be C:\OpenSSL-Win64, not C:\OpenSSL-Win32 as shown in the tutorial. The environment variables should look like this:
set OPENSSL_CONF=C:\OpenSSL-Win64\bin\openssl.cfg
set Path=......Other Values here......;C:\OpenSSL-Win64\bin
Don’t forget to restart the computer afterward.
After generating the certificate with OpenSSL in the command prompt, copy it in and replace the value of root_ca. Make sure to keep the exact same format.
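The “exact format” matters because root_ca is a C string literal in the sketch: every PEM line keeps its own \n, roughly like this (the certificate body itself is elided):

```cpp
// Each line of the PEM output from OpenSSL becomes one quoted line ending
// in \n. The base64 certificate body is elided here -- paste your own.
const char* root_ca =
    "-----BEGIN CERTIFICATE-----\n"
    "...base64 lines from the OpenSSL output, one per quoted line...\n"
    "-----END CERTIFICATE-----\n";
```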
One last thing that caused an error callback for me: the name of a property in HttpBody1 was wrong. It should be "sampleRateHertz", not "sampleRate". While I was there, I also changed "languageCode":"ja-JP" to "languageCode":"en-US".
I was able to get it to work after all these changes. I said “Hello Hello” and this came back in the Serial Monitor – pretty satisfying!
I realized that client.read() returns more than just the JSON data – there is a huge header too. After some research online, I didn’t find anything useful on how to get rid of the header, so I went the old-school way: remove(). After cleaning up the extra strings both before and after the data, it works beautifully with the ArduinoJson library.
This helps to calculate the size of the data and also shows how to access specific data in the JSON tree. I was having problems understanding the examples in the library, but this tool did it for me: https://arduinojson.org/v6/assistant/
To get this code to work, I made a total of 48 requests to the Google Cloud Platform and used 2.75 minutes. As an indie experimentalist, I think 60 free minutes per month is a really good deal.