
Do I always need delays in FDD code?

Posted: Sun Jul 16, 2006 5:34 pm
by earlz
In my FDD code I use timeout code, like this:

Code:

unsigned char FDD_ReadyGet(){
    unsigned int delay;

    /* assumes timer_ticks is a volatile tick counter incremented by the
       timer IRQ at 1000Hz, so 2000 ticks = a 2-second timeout */
    delay = timer_ticks + 2000;
    while (delay > timer_ticks) {
        /* ready when RQM and DIO are both set in the Main Status Register */
        if ((inportb(MAIN_STAT) & 0xC0) == 0xC0) { return 1; }
    }
    return 0; /* timed out */
}
I don't need to insert a wait() in there do I like, the Main Status Register is never out of date is it?

And for other ports, do I need a wait between inportb/outportb calls to give the hardware time to respond?
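
The usual trick for a short delay between port accesses on a PC is a dummy read of port 0x80 (the POST diagnostic port), which burns roughly a microsecond of bus time. A minimal sketch, using the inportb() from above; the io_wait() name is just illustrative:

Code:

/* io_wait(): hypothetical helper. A read from unused port 0x80 stalls
   for about one microsecond; the value read is discarded, only the
   timing side effect matters. */
static void io_wait(void)
{
   (void)inportb(0x80);
}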

Posted: Mon Jul 17, 2006 1:24 pm
by bubach
Most people programming in C use these; I'm not sure whether they really are from Intel or not:

Code:

/* sendbyte() routine from intel manual */
void sendbyte(int byte)
{
   volatile int msr;
   int tmo;

   for (tmo = 0; tmo < 128; tmo++) {
      msr = inportb(FDC_MSR);
      /* RQM set and DIO clear: the FDC will accept a byte from the CPU */
      if ((msr & 0xc0) == 0x80) {
         outportb(FDC_DATA, byte);
         return;
      }
      inportb(0x80);   /* delay */
   }
}

/* getbyte() routine from intel manual */
int getbyte()
{
   volatile int msr;
   int tmo;

   for (tmo = 0; tmo < 128; tmo++) {
      msr = inportb(FDC_MSR);
      /* RQM and DIO set with the controller busy: a result byte is waiting */
      if ((msr & 0xd0) == 0xd0) {
         return inportb(FDC_DATA);
      }
      inportb(0x80);   /* delay */
   }

   return -1;   /* read timeout */
}
HTH, Christoffer
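
As a usage example, the two helpers are enough to probe the controller type with the VERSION command. A sketch, assuming the sendbyte()/getbyte() above; the fdc_is_enhanced() wrapper is illustrative:

Code:

/* Sending the VERSION command (0x10) makes an enhanced 82077AA-class
   controller answer 0x90; an older NEC765-class part treats it as an
   invalid command and returns 0x80 instead. */
int fdc_is_enhanced(void)
{
   sendbyte(0x10);             /* VERSION command */
   return getbyte() == 0x90;   /* 0x90 => enhanced controller */
}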