Robby
2010-02-01 01:52:01 UTC
Hello,
Can someone tell me why I am getting a divide-by-zero error in the following
fragment of code?
===========================================main.c
#include <stdio.h>

#define SYS_OSC       10500000      // Current system clock, 10.5 MHz
#define MICRO_SEC     0.000001      // 1 microsecond
#define OSC_PERIOD    (1/SYS_OSC)   // 1 second divided by the system clock
#define OVERHEAD_IC   58            // 58 overhead instructions
#define DELAY_LOOP_IC 7             // 7 instructions per while-loop iteration
#define RATIO(a,b,c)  (a/(b*c))     // a, b, c are macro parameters

int main()
{
    unsigned int us_delay;
    float x;

    x = (float)RATIO((float)MICRO_SEC, (float)OSC_PERIOD, (float)DELAY_LOOP_IC)
        * (float)us_delay;

    return 0;
}
===============================================
I don't think I am dividing by zero!
b*c = 7 * (1/10.5 MHz) = 0.667 us
and
a = 1 us
so this is what I expect, no?
1 us / 0.667 us = 1.5
Unless the compiler sees 0.000000667 as a "0"?
Confused?
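To double-check my reasoning I tried expanding the macros by hand. Below is a
minimal sketch of what I believe is going on (just my guess at the textual
substitution; the small test program and its printf lines are only for
illustration). The (float) cast in my call wraps the whole (1/SYS_OSC)
expression, so if that inner division is done in int arithmetic the result
would already be truncated to 0 before the cast:
===============================================
#include <stdio.h>

#define SYS_OSC    10500000      // Current system clock, 10.5 MHz
#define OSC_PERIOD (1/SYS_OSC)   // 1 second divided by the system clock

int main()
{
    /* My guess: both operands of 1/SYS_OSC are ints, so the quotient is
       truncated to 0 before any cast to float is applied. */
    printf("1/SYS_OSC as int  : %d\n", 1/SYS_OSC);
    printf("(float)OSC_PERIOD : %.9f\n", (float)OSC_PERIOD);

    /* For comparison, forcing a floating-point divide: */
    printf("1.0/SYS_OSC       : %.9f\n", 1.0/SYS_OSC);
    return 0;
}
===============================================
If that is what is happening, then b*c inside RATIO() is a compile-time
constant 0.0, which might be what the compiler is complaining about, but I
would appreciate confirmation.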
All help is sincerely appreciated. Thanks!
--
Best regards
Rob