I need to integrate an analog signal using my Arduino Uno. I am sampling the input signal at 1 kHz using a delay command in my loop() and adding up the analogRead values at each sample time. Then I multiply my accumulated sum by 0.001 s (the sampling period) to get my integral.
Since I am new to the Arduino Uno, I would like some review of the code to check whether I am correct or not. Kindly help, I'll be indebted.
const int a = 13;  // my output pin
const int c = A0;  // my input pin
int d = 0;         // my summation variable
int e = 0;         // my integration variable (integration as a summation)

void setup()
{
  pinMode(a, OUTPUT);
  Serial.begin(9600);  // serial must be initialized before Serial.print() works
}

void loop()
{
  int b = analogRead(c);
  d = d + b;       // my summation
  e = d*10^-3;     // my integration as a summation, by sampling my signal at 1 ms
  analogWrite(a, e);  // writing my integration to my output pin
  // print the results to the serial monitor
  Serial.print("input=");
  Serial.print(b);
  Serial.print("\t output=");
  Serial.println(e);
  delay(1);  // wait for 1 ms as I want it sampled at that rate
}
An int, d, could overflow in as little as 32 measurements. Also analogRead does take some time, as does the code around it, so the time between measurements will be a bit more than 1 ms. I also don't see how you call dividing by 1000 integration. – Gerben Mar 18 at 11:14

Even with a long int, it will eventually overflow. If you store it in a float, it will lose precision as it grows and, eventually, the summation will have no effect, because d+b is exactly equal to d if d is a big enough float. – Edgar Bonet Mar 19 at 12:22