AstroFusion
Made by jhli2 and Anya Singhal
AstroFusion will not only give you your Western and Vedic horoscopes, it'll give you a prediction about your life too! It compares your two horoscopes based on your birth date, and you tell it what you want to hear about yourself. Is it love? Your career? Maybe health? AstroFusion combines your horoscopes across two cultures, finds the differences and similarities, and tries to offer a prediction about your life that you want to hear.
Created: March 24th, 2023
The intent of this project is not only to educate the user about how different cultures approach astrology, but to make them question what they believe in. Western astrology has become part of Gen Z fads, which in turn have made its supposed ties to reality more obscure: Buzzfeed quizzes about which Harry Potter character you are, or which signs should be together, make the claimed science behind it harder to see. The goal was that by giving the user two different predictions, one from their Western horoscope and one from their Hindu (Vedic) horoscope, the device might reveal some shocking or even spooky similarities and differences that cause them to question (or perhaps strengthen) their beliefs.
Both of us had an initial interest in astrology, but more from a critical lens. The initial idea was a Gen Z version of the Magic 8 Ball, almost as a statement about people's desperation to find answers to unanswerable questions in meaningless devices. The biggest influence was my parents' beliefs, which I've found myself subconsciously believing in as well. They often consult a pundit (guru) for predictions relating to me and my sisters, and many of them have come true. It made me start to wonder whether these predictions had any connection to Western astrology. If we could lay the two modes of astrology over one another like films, could that uncover striking similarities and differences that bring us closer to some "truth"?
The final prototype is an acrylic box engraved with symbols and images from Western astrology, housing an Arduino Nano 33 BLE Sense (whose onboard APDS9960 serves as the color sensor), a keypad, and a thermal printer. The BLE Sense runs a machine-learning model built with Edge Impulse to distinguish three different inputs, each of which leads to a different prediction for the user: the user shows the device a red (love), green (career), or blue (health) card. The keypad is intended to take in the user's birth date, which the device uses to assign both a Western and a Vedic horoscope. Finally, the printer prints the two horoscopes, along with the prediction the user asked for.
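The sign assignment itself is just a date-range lookup on the month and day entered at the keypad. As a condensed illustration, here is a table-driven sketch of the Western half of that lookup; the actual Arduino sketch further down spells the same boundaries out as an if/else chain, and names like SignBoundary and westernSign are ours, not the device's.

// Illustrative, table-driven sketch of the Western-sign lookup; the device's
// actual sketch (below) uses an explicit if/else chain with the same dates.
#include <cstdio>

struct SignBoundary {
    int month, day;        // last day (inclusive) covered by this sign
    const char *name;
};

// Each entry gives the end of a sign's range: dates on or before Jan 19 are
// Capricorn, on or before Feb 18 Aquarius, and so on. Dec 22-31 wraps back
// to Capricorn via the final entry.
static const SignBoundary WESTERN[] = {
    {1, 19, "Capricorn"}, {2, 18, "Aquarius"},  {3, 20, "Pisces"},
    {4, 19, "Aries"},     {5, 20, "Taurus"},    {6, 20, "Gemini"},
    {7, 22, "Cancer"},    {8, 22, "Leo"},       {9, 22, "Virgo"},
    {10, 22, "Libra"},    {11, 21, "Scorpio"},  {12, 21, "Sagittarius"},
    {12, 31, "Capricorn"},
};

const char *westernSign(int month, int day) {
    for (const SignBoundary &b : WESTERN) {
        if (month < b.month || (month == b.month && day <= b.day)) {
            return b.name;
        }
    }
    return "Capricorn";   // unreachable for valid dates
}

int main() {
    std::printf("March 4 -> %s\n", westernSign(3, 4));    // Pisces
    std::printf("Dec 25  -> %s\n", westernSign(12, 25));  // Capricorn
    return 0;
}

The Vedic lookup in the device works the same way, just with sidereal boundaries that run roughly from mid-month to mid-month.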
The final prototype had some technical difficulties that prevented the device from being as fully functional as intended. First, the keypad did not work as intended: it sent random inputs, which triggered the printer at random times. Additionally, we had issues fully integrating the OpenAI platform with the device. Finally, we had some oversights when designing the physical artifact, which resulted in more of the final project sitting outside the casing than initially intended.
/* Edge Impulse ingestion SDK
* Copyright (c) 2022 EdgeImpulse Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
/* Includes ---------------------------------------------------------------- */
#include <project-2_inferencing.h>
#include <Arduino_LSM9DS1.h> //Click here to get the library: http://librarymanager/All#Arduino_LSM9DS1
#include <Arduino_LPS22HB.h> //Click here to get the library: http://librarymanager/All#Arduino_LPS22HB
#include <Arduino_HTS221.h> //Click here to get the library: http://librarymanager/All#Arduino_HTS221
#include <Arduino_APDS9960.h> //Click here to get the library: http://librarymanager/All#Arduino_APDS9960
enum sensor_status {
NOT_USED = -1,
NOT_INIT,
INIT,
SAMPLED
};
/** Struct to link sensor axis name to sensor value function */
typedef struct{
const char *name;
float *value;
uint8_t (*poll_sensor)(void);
bool (*init_sensor)(void);
sensor_status status;
} eiSensors;
/* Constant defines -------------------------------------------------------- */
#define CONVERT_G_TO_MS2 9.80665f
#define MAX_ACCEPTED_RANGE 2.0f // starting 03/2022, models are generated setting range to +-2,
// but this example uses the Arduino library, which sets the range to +-4g.
// If you are using an older model, ignore this value and use 4.0f instead
/** Number sensor axes used */
#define N_SENSORS 18
/* Forward declarations ------------------------------------------------------- */
float ei_get_sign(float number);
bool init_IMU(void);
bool init_HTS(void);
bool init_BARO(void);
bool init_APDS(void);
uint8_t poll_acc(void);
uint8_t poll_gyr(void);
uint8_t poll_mag(void);
uint8_t poll_HTS(void);
uint8_t poll_BARO(void);
uint8_t poll_APDS_color(void);
uint8_t poll_APDS_proximity(void);
uint8_t poll_APDS_gesture(void);
/* Private variables ------------------------------------------------------- */
static const bool debug_nn = false; // Set this to true to see e.g. features generated from the raw signal
static float data[N_SENSORS];
static bool ei_connect_fusion_list(const char *input_list);
static int8_t fusion_sensors[N_SENSORS];
static int fusion_ix = 0;
/** Used sensors value function connected to label name */
eiSensors sensors[] =
{
"accX", &data[0], &poll_acc, &init_IMU, NOT_USED,
"accY", &data[1], &poll_acc, &init_IMU, NOT_USED,
"accZ", &data[2], &poll_acc, &init_IMU, NOT_USED,
"gyrX", &data[3], &poll_gyr, &init_IMU, NOT_USED,
"gyrY", &data[4], &poll_gyr, &init_IMU, NOT_USED,
"gyrZ", &data[5], &poll_gyr, &init_IMU, NOT_USED,
"magX", &data[6], &poll_mag, &init_IMU, NOT_USED,
"magY", &data[7], &poll_mag, &init_IMU, NOT_USED,
"magZ", &data[8], &poll_mag, &init_IMU, NOT_USED,
"temperature", &data[9], &poll_HTS, &init_HTS, NOT_USED,
"humidity", &data[10], &poll_HTS, &init_HTS, NOT_USED,
"pressure", &data[11], &poll_BARO, &init_BARO, NOT_USED,
"red", &data[12], &poll_APDS_color, &init_APDS, NOT_USED,
"green", &data[13], &poll_APDS_color, &init_APDS, NOT_USED,
"blue", &data[14], &poll_APDS_color, &init_APDS, NOT_USED,
"brightness", &data[15], &poll_APDS_color, &init_APDS, NOT_USED,
"proximity", &data[16], &poll_APDS_proximity, &init_APDS, NOT_USED,
"gesture", &data[17], &poll_APDS_gesture,&init_APDS, NOT_USED,
};
#include <Adafruit_Thermal.h>
#include <Keypad.h>
//#define RX_PIN 6
//#define TX_PIN 7
#define BAUD_RATE 9600
// initialize the keypad
const byte ROWS = 4;
const byte COLS = 3;
char keys[ROWS][COLS] = {
{'1', '2', '3'},
{'4', '5', '6'},
{'7', '8', '9'},
{'*', '0', '#'}
};
byte rowPins[ROWS] = {9, 8, 7, 6};
byte colPins[COLS] = {5, 4, 3};
Keypad keypad = Keypad(makeKeymap(keys), rowPins, colPins, ROWS, COLS);
// initialize the thermal printer
Adafruit_Thermal printer(&Serial1);
/**
* @brief Arduino setup function
*/
void setup()
{
/* Init serial */
Serial.begin(115200);
// comment out the below line to cancel the wait for USB connection (needed for native USB)
while (!Serial);
Serial.println("Edge Impulse Sensor Fusion Inference\r\n");
/* Connect used sensors */
if(ei_connect_fusion_list(EI_CLASSIFIER_FUSION_AXES_STRING) == false) {
ei_printf("ERR: Errors in sensor list detected\r\n");
return;
}
/* Init & start sensors */
for(int i = 0; i < fusion_ix; i++) {
if (sensors[fusion_sensors[i]].status == NOT_INIT) {
sensors[fusion_sensors[i]].status = (sensor_status)sensors[fusion_sensors[i]].init_sensor();
if (!sensors[fusion_sensors[i]].status) {
ei_printf("%s axis sensor initialization failed.\r\n", sensors[fusion_sensors[i]].name);
}
else {
ei_printf("%s axis sensor initialization successful.\r\n", sensors[fusion_sensors[i]].name);
}
}
}
//Serial.begin(BAUD_RATE);
Serial1.begin(BAUD_RATE);
printer.begin();
}
// records which prediction was printed (1 = health, 2 = career, 3 = love); not otherwise used
int thing = 0;
/**
* @brief Get data and run inferencing
*/
void loop()
{
ei_printf("\nStarting inferencing in 2 seconds...\r\n");
delay(2000);
if (EI_CLASSIFIER_RAW_SAMPLES_PER_FRAME != fusion_ix) {
ei_printf("ERR: Sensors don't match the sensors required in the model\r\n"
"Following sensors are required: %s\r\n", EI_CLASSIFIER_FUSION_AXES_STRING);
return;
}
ei_printf("Sampling...\r\n");
// Allocate a buffer here for the values we'll read from the sensor
float buffer[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE] = { 0 };
for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix += EI_CLASSIFIER_RAW_SAMPLES_PER_FRAME) {
// Determine the next tick (and then sleep later)
int64_t next_tick = (int64_t)micros() + ((int64_t)EI_CLASSIFIER_INTERVAL_MS * 1000);
for(int i = 0; i < fusion_ix; i++) {
if (sensors[fusion_sensors[i]].status == INIT) {
sensors[fusion_sensors[i]].poll_sensor();
sensors[fusion_sensors[i]].status = SAMPLED;
}
if (sensors[fusion_sensors[i]].status == SAMPLED) {
buffer[ix + i] = *sensors[fusion_sensors[i]].value;
sensors[fusion_sensors[i]].status = INIT;
}
}
int64_t wait_time = next_tick - (int64_t)micros();
if(wait_time > 0) {
delayMicroseconds(wait_time);
}
}
// Turn the raw buffer into a signal which we can then classify
signal_t signal;
int err = numpy::signal_from_buffer(buffer, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
if (err != 0) {
ei_printf("ERR:(%d)\r\n", err);
return;
}
// Run the classifier
ei_impulse_result_t result = { 0 };
err = run_classifier(&signal, &result, debug_nn);
if (err != EI_IMPULSE_OK) {
ei_printf("ERR:(%d)\r\n", err);
return;
}
// print the predictions
ei_printf("Predictions (DSP: %d ms., Classification: %d ms., Anomaly: %d ms.):\r\n",
result.timing.dsp, result.timing.classification, result.timing.anomaly);
for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
ei_printf("%s: %.5f\r\n", result.classification[ix].label, result.classification[ix].value);
}
#if EI_CLASSIFIER_HAS_ANOMALY == 1
ei_printf(" anomaly score: %.3f\r\n", result.anomaly);
#endif
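// Map the classifier output to a prediction category and print it: with this
// model, index 0 is treated as the health prediction, index 1 as career, and
// anything else falls through to love.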
if(result.classification[0].value >= 0.6) {
thing = 1;
printer.println("Health prediction: Prioritizing self-care and a healthy lifestyle will lead to improved physical and mental well-being. Be mindful of stress and take steps to manage it for optimal health.");
}
else if(result.classification[1].value >= 0.6) {
thing = 2;
printer.println("Career prediction: Your hard work and dedication will pay off soon, leading to a major achievement or recognition in your chosen field. Keep pushing forward with confidence.");
}
else {
thing = 3;
printer.println("Love life prediction: Your romantic life will experience a surge of passion and intensity in the near future, leading to a deep connection with a soulmate.");
}
p1234();
}
#if !defined(EI_CLASSIFIER_SENSOR) || (EI_CLASSIFIER_SENSOR != EI_CLASSIFIER_SENSOR_FUSION && EI_CLASSIFIER_SENSOR != EI_CLASSIFIER_SENSOR_ACCELEROMETER)
#error "Invalid model for current sensor"
#endif
/**
* @brief Go through sensor list to find matching axis name
*
* @param axis_name
* @return int8_t index in sensor list, -1 if axis name is not found
*/
static int8_t ei_find_axis(char *axis_name)
{
int ix;
for(ix = 0; ix < N_SENSORS; ix++) {
if(strstr(axis_name, sensors[ix].name)) {
return ix;
}
}
return -1;
}
/**
* @brief Check if requested input list is valid sensor fusion, create sensor buffer
*
* @param[in] input_list Axes list to sample (ie. "accX + gyrY + magZ")
* @retval false if invalid sensor_list
*/
static bool ei_connect_fusion_list(const char *input_list)
{
char *buff;
bool is_fusion = false;
/* Copy const string in heap mem */
char *input_string = (char *)ei_malloc(strlen(input_list) + 1);
if (input_string == NULL) {
return false;
}
memset(input_string, 0, strlen(input_list) + 1);
strncpy(input_string, input_list, strlen(input_list));
/* Clear fusion sensor list */
memset(fusion_sensors, 0, N_SENSORS);
fusion_ix = 0;
buff = strtok(input_string, "+");
while (buff != NULL) { /* Run through buffer */
int8_t found_axis = 0;
is_fusion = false;
found_axis = ei_find_axis(buff);
if(found_axis >= 0) {
if(fusion_ix < N_SENSORS) {
fusion_sensors[fusion_ix++] = found_axis;
sensors[found_axis].status = NOT_INIT;
}
is_fusion = true;
}
buff = strtok(NULL, "+ ");
}
ei_free(input_string);
return is_fusion;
}
/**
* @brief Return the sign of the number
*
* @param number
* @return int 1 if positive (or 0) -1 if negative
*/
float ei_get_sign(float number) {
return (number >= 0.0) ? 1.0 : -1.0;
}
bool init_IMU(void) {
static bool init_status = false;
if (!init_status) {
init_status = IMU.begin();
}
return init_status;
}
bool init_HTS(void) {
static bool init_status = false;
if (!init_status) {
init_status = HTS.begin();
}
return init_status;
}
bool init_BARO(void) {
static bool init_status = false;
if (!init_status) {
init_status = BARO.begin();
}
return init_status;
}
bool init_APDS(void) {
static bool init_status = false;
if (!init_status) {
init_status = APDS.begin();
}
return init_status;
}
uint8_t poll_acc(void) {
if (IMU.accelerationAvailable()) {
IMU.readAcceleration(data[0], data[1], data[2]);
for (int i = 0; i < 3; i++) {
if (fabs(data[i]) > MAX_ACCEPTED_RANGE) {
data[i] = ei_get_sign(data[i]) * MAX_ACCEPTED_RANGE;
}
}
data[0] *= CONVERT_G_TO_MS2;
data[1] *= CONVERT_G_TO_MS2;
data[2] *= CONVERT_G_TO_MS2;
}
return 0;
}
uint8_t poll_gyr(void) {
if (IMU.gyroscopeAvailable()) {
IMU.readGyroscope(data[3], data[4], data[5]);
}
return 0;
}
uint8_t poll_mag(void) {
if (IMU.magneticFieldAvailable()) {
IMU.readMagneticField(data[6], data[7], data[8]);
}
return 0;
}
uint8_t poll_HTS(void) {
data[9] = HTS.readTemperature();
data[10] = HTS.readHumidity();
return 0;
}
uint8_t poll_BARO(void) {
data[11] = BARO.readPressure(); // (PSI/MILLIBAR/KILOPASCAL) default kPa
return 0;
}
uint8_t poll_APDS_color(void) {
int temp_data[4];
if (APDS.colorAvailable()) {
APDS.readColor(temp_data[0], temp_data[1], temp_data[2], temp_data[3]);
data[12] = temp_data[0];
data[13] = temp_data[1];
data[14] = temp_data[2];
data[15] = temp_data[3];
}
return 0;
}
uint8_t poll_APDS_proximity(void) {
if (APDS.proximityAvailable()) {
data[16] = (float)APDS.readProximity();
}
return 0;
}
uint8_t poll_APDS_gesture(void) {
if (APDS.gestureAvailable()) {
data[17] = (float)APDS.readGesture();
}
return 0;
}
// Read a 4-digit birth date (MMDD) from the keypad, echo it to the printer,
// then print the user's Western and Vedic signs.
void p1234(){
char input[5];
int i = 0;
char key = 0;
while (i < 4) {
key = keypad.getKey();
if (key) {
input[i] = key;
i++;
printer.write(key);
}
}
input[4] = '\0'; // null terminate the input string
printer.println();
// parse the input string to get the month and day
int month = (input[0] - '0') * 10 + (input[1] - '0');
int day = (input[2] - '0') * 10 + (input[3] - '0');
// crude validation: fall back to a default date if the entry is out of range
if (month < 1 || month > 12 || day < 1 || day > 28) {
month = 3;
day = 4;
}
const char* WZ = NULL;
// print the Western zodiac sign based on the month and day
if ((month == 3 && day >= 21) || (month == 4 && day <= 19)) {
printer.println("Western Zodiac: Aries");
WZ = "Aries";
} else if ((month == 4 && day >= 20) || (month == 5 && day <= 20)) {
printer.println("Western Zodiac: Taurus");
WZ = "Taurus";
} else if ((month == 5 && day >= 21) || (month == 6 && day <= 20)) {
printer.println("Western Zodiac: Gemini");
WZ = "Gemini";
} else if ((month == 6 && day >= 21) || (month == 7 && day <= 22)) {
printer.println("Western Zodiac: Cancer");
WZ = "Cancer";
} else if ((month == 7 && day >= 23) || (month == 8 && day <= 22)) {
printer.println("Western Zodiac: Leo");
WZ = "Leo";
} else if ((month == 8 && day >= 23) || (month == 9 && day <= 22)) {
printer.println("Western Zodiac: Virgo");
WZ = "Virgo";
} else if ((month == 9 && day >= 23) || (month == 10 && day <= 22)) {
printer.println("Western Zodiac: Libra");
WZ = "Libra";
} else if ((month == 10 && day >= 23) || (month == 11 && day <= 21)) {
printer.println("Western Zodiac: Scorpio");
WZ = "Scorpio";
} else if ((month == 11 && day >= 22) || (month == 12 && day <= 21)) {
printer.println("Western Zodiac: Sagittarius");
WZ = "Sagittarius";
} else if ((month == 12 && day>= 22) || (month == 1 && day <= 19)) {
printer.println("Western Zodiac: Capricorn");
WZ = "Capricorn";
} else if ((month == 1 && day >= 20) || (month == 2 && day <= 18)) {
printer.println("Western Zodiac: Aquarius");
WZ = "Aquarius";
} else if ((month == 2 && day >= 19) || (month == 3 && day <= 20)) {
printer.println("Western Zodiac: Pisces");
WZ = "Pisces";
}
const char* VS = NULL;
// print the Vedic sign based on the month and day
if ((month == 4 && day >= 13) || (month == 5 && day <= 14)) {
printer.println("Vedic Sign: Mesha");
VS = "Mesha";
} else if ((month == 5 && day >= 15) || (month == 6 && day <= 14)) {
printer.println("Vedic Sign: Vrishaba");
VS = "Vrishaba";
} else if ((month == 6 && day >= 15) || (month == 7 && day <= 14)) {
printer.println("Vedic Sign: Mithuna");
VS = "Mithuna";
} else if ((month == 7 && day >= 15) || (month == 8 && day <= 14)) {
printer.println("Vedic Sign: Karkata");
VS = "Karkata";
} else if ((month == 8 && day >= 15) || (month == 9 && day <= 15)) {
printer.println("Vedic Sign: Simha");
VS = "Simha";
} else if ((month == 9 && day >= 16) || (month == 10 && day <= 15)) {
printer.println("Vedic Sign: Kanya");
VS = "Kanya";
} else if ((month == 10 && day >= 16) || (month == 11 && day <= 14)) {
printer.println("Vedic Sign: Tula");
VS = "Tula";
} else if ((month == 11 && day >= 15) || (month == 12 && day <= 14)) {
printer.println("Vedic Sign: Vrishchika");
VS = "Vrishchika";
} else if ((month == 12 && day >= 15) || (month == 1 && day <= 13)) {
printer.println("Vedic Sign: Dhanus");
VS = "Dhanus";
} else if ((month == 1 && day >= 14) || (month == 2 && day <= 11)) {
printer.println("Vedic Sign: Makara");
VS = "Makara";
} else if ((month == 2 && day >= 12) || (month == 3 && day <= 12)) {
printer.println("Vedic Sign: Kumbha");
VS = "Kumbha";
} else if ((month == 3 && day >= 13) || (month == 4 && day <= 12)) {
printer.println("Vedic Sign: Meena");
VS = "Meena";
}
// point out key differences between Western and Vedic signs
printer.println();
//printer.println("Key Differences:");
//printer.println("- The Western zodiac is based on the position of the sun relative to the constellations, while the Vedic zodiac is based on the position of the moon relative to the constellations.");
//printer.println();
printer.println("Next horoscope:");
}
The learning curve for the ML models and OpenAI integration was a bit rough, and some difficulties remained in the final prototype. Developing the ML model itself was fairly simple, but integrating the library with the overall code was harder: the sensor seemed to become less consistent once everything was combined. The printer was fairly straightforward to integrate, and we understood that portion right as it was introduced.
Project log 3 began to get into what the physical appearance of the final product would look like, and log 4 had a full model of the final box. Initially, we had a different vision for the physical artifact and interaction, where the user would present a disk of colors that they had created, from which the machine would generate a prediction. We later pivoted to having the device read simple colored cards as our concept shifted.
Apart from the technical challenges we faced, the conceptual feedback for the project focused on how to make it feel more spooky. One question was how the predictions might become spookier in the way they relate the Western and Vedic horoscopes to each other.
One interesting question that came up was how a device like AstroFusion could spark an interest in astrology in someone who isn't invested. Astrology can be a polarizing topic, but one comment during the review was that AstroFusion could begin to make these differences in cultural astrological beliefs more interesting.
One piece of feedback we received about the physical device was to have the color cards go into a slot rather than being slid over the top of a sensor. Another comment was to consider the physical context of the device and how one might come across it. This tied into a suggestion to include more instructions with the device, as it is a bit difficult to understand without explanation.
Aside from receiving horoscope-specific predictions from ChatGPT, we achieved most of our technical objectives. We were able to read in a color, give the person's Western and Vedic signs, and give them a color-related prediction. However, I don't think we fully created the impact on the user that we wanted. The main issue was that the keypad was so sensitive that it couldn't take in an actual typed birthday, because it read a bunch of other numbers in between. This meant the user could not really interact with the device and feel "spooked" at how much it could predict about them. Although our housing for the device was quite pretty/spooky (in my opinion), we weren't actually able to get all the components inside the housing to create a clean and mysterious result. I think the project was received quite well, and it was interesting to users because they were learning something new (their Vedic sign), but those last few fixes would definitely be needed to fully satisfy our aspirations.
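If we were to revisit the keypad, one direction worth trying, purely a sketch on our part and untested, would be to lengthen the Keypad library's debounce window and to ignore everything that isn't a digit until four digits have been collected. The snippet below assumes the same keypad object and row/column wiring declared in the code above; readBirthdayStrict is a hypothetical replacement for p1234's input loop.

// Untested sketch of a stricter birthday read; assumes the keypad object and
// row/column wiring already declared in the main sketch above.
#include <Keypad.h>

extern Keypad keypad;              // defined in the main sketch

// Collect exactly four digits (MMDD), ignoring any non-digit chatter.
// '*' clears the entry so a mis-read can be started over.
void readBirthdayStrict(char out[5]) {
    keypad.setDebounceTime(50);    // Keypad library debounce, in milliseconds
    int i = 0;
    while (i < 4) {
        char key = keypad.getKey();    // NO_KEY (0) when nothing new is pressed
        if (key >= '0' && key <= '9') {
            out[i++] = key;
        } else if (key == '*') {
            i = 0;                     // start over
        }
    }
    out[4] = '\0';
}

Whether that alone would tame the stray readings is an open question, since some of the noise may have come from the wiring itself rather than from bounce.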
For this project, a lot of research was needed! In addition to all of the class resources (lecture slides and Notion pages), we used the following websites:
https://www.mindbodygreen.com/articles/vedic-astrology-101
https://www.moondanceastrology.com/vedic-astrology-chart-calculator
https://nypost.com/article/what-is-vedic-astrology/