Writing OpenGL Programs in Linux without desktops

Programming, for all ages and all languages.
Post Reply
psnix
Member
Posts: 50
Joined: Fri Oct 24, 2008 12:34 pm

Writing OpenGL Programs in Linux without desktops

Post by psnix »

I want to develop an embedded Linux system on x86. How can I write OpenGL programs without any desktop environment (such as GNOME or KDE)?

I want the system to run only my application after booting, and the user shouldn't be able to run any other application. What can I do?

Sorry for my bad English.
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: Writing OpenGL Programs in Linux without desktops

Post by Combuster »

I suggest you start by getting familiar with how the graphical system in Linux works. It's not a good idea to go into high arcana without knowing the fundamentals.

That said, one possible way is to make your own desktop; look here for some clues: http://www.manpagez.com/man/1/xinit/
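As a rough sketch of what the man page above describes: xinit starts an X server and a single client program, so a "desktop" can be as small as one command. The binary name below is a placeholder, not a real program.

```shell
# Start a bare X server on display :1 and run exactly one client in it.
# The whole X session ends when that client exits.
# "./myglapp" is a placeholder for your own OpenGL binary.
xinit ./myglapp -- :1
```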
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
psnix
Member
Posts: 50
Joined: Fri Oct 24, 2008 12:34 pm

Re: Writing OpenGL Programs in Linux without desktops

Post by psnix »

I suggest you start by getting familiar with how the graphical system in Linux works. It's not a good idea to go into high arcana without knowing the fundamentals.
Thanks.

So I can do that :D

Can you suggest a book or tutorial?
KotuxGuy
Member
Posts: 96
Joined: Wed Nov 25, 2009 1:28 pm
Location: Somewhere within 10ft of my favorite chubby penguin!

Re: Writing OpenGL Programs in Linux without desktops

Post by KotuxGuy »

First off:
  • You don't need a desktop environment (DE) to write OpenGL/SDL programs in the first place. Period. You just need X.
  • A DE is not needed to run any graphical application, since that's what X is for, unless the application depends on some library that the DE provides.
There, now we've gotten that out of the way...
  • Combuster gave a good link on Linux's graphical system (X). I suggest you study it (or type "man xinit" at a Linux terminal).
  • You could make your own DE if you wanted, as doing so would give you greater flexibility. But if you just want to start an OpenGL game (say, BZFlag), you just add the command that starts the game to the .xinitrc file in the user's home directory.
Hope that helps!
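The .xinitrc step above could look like the sketch below, assuming the game's binary (bzflag here) is installed and on the PATH.

```shell
# ~/.xinitrc -- read by startx/xinit when no client program is given
# on the command line. `exec` replaces the shell with the game, so
# the X session lasts exactly as long as the game is running.
exec bzflag
```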
Give a man Linux, you feed the nearest optician ( Been staring at the PC too long again? ).
Give a man OS X, you feed the nearest NVidia outlet ( I need more GPU power!! )
Give a man Windows, you feed the entire Tylenol company ( Self explanatory :D )
psnix
Member
Posts: 50
Joined: Fri Oct 24, 2008 12:34 pm

Re: Writing OpenGL Programs in Linux without desktops

Post by psnix »

Hi,
I did the following steps, but I can't see anything:

1. I wrote an OpenGL application named "test".
2. I wrote a shell script named "run.sh" that starts the X server and the OpenGL application:
X :1 &
./test
3. Then I switched to tty1 (Ctrl+Alt+F1).
4. I executed the script.

But I can't see anything on the screen!

app code:

Code:

#include <stdio.h>
#include <stdlib.h>
#include <X11/X.h>
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>
#include <GL/glu.h>

Display                 *dpy;
Window                  root;
GLint                   att[] = { GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None };
XVisualInfo             *vi;
Colormap                cmap;
XSetWindowAttributes    swa;
Window                  win;
GLXContext              glc;
XWindowAttributes       gwa;
XEvent                  xev;

void DrawAQuad() {
    glClearColor(1.0, 1.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-1., 1., -1., 1., 1., 20.);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0., 0., 10., 0., 0., 0., 0., 1., 0.);

    glBegin(GL_QUADS);
    glColor3f(1., 0., 0.); glVertex3f(-.75, -.75, 0.);
    glColor3f(0., 1., 0.); glVertex3f( .75, -.75, 0.);
    glColor3f(0., 0., 1.); glVertex3f( .75,  .75, 0.);
    glColor3f(1., 1., 0.); glVertex3f(-.75,  .75, 0.);
    glEnd();
}

int main(int argc, char *argv[]) {
    /* connect to the X server named by the DISPLAY environment variable */
    dpy = XOpenDisplay(NULL);

    if (dpy == NULL) {
        printf("\n\tcannot connect to X server\n\n");
        exit(1);
    }

    root = DefaultRootWindow(dpy);

    /* ask screen 0 for a double-buffered RGBA visual with a depth buffer */
    vi = glXChooseVisual(dpy, 0, att);

    if (vi == NULL) {
        printf("\n\tno appropriate visual found\n\n");
        exit(1);
    } else {
        /* visualid is an integer ID, not a pointer; the cast lets %p
           print it in hexadecimal like glxinfo does */
        printf("\n\tvisual %p selected\n", (void *)vi->visualid);
    }

    cmap = XCreateColormap(dpy, root, vi->visual, AllocNone);

    swa.colormap = cmap;
    swa.event_mask = ExposureMask | KeyPressMask;

    win = XCreateWindow(dpy, root, 0, 0, 600, 600, 0, vi->depth, InputOutput,
                        vi->visual, CWColormap | CWEventMask, &swa);

    XMapWindow(dpy, win);
    XStoreName(dpy, win, "VERY SIMPLE APPLICATION");

    /* create a GL context for the chosen visual and make it current */
    glc = glXCreateContext(dpy, vi, NULL, GL_TRUE);
    glXMakeCurrent(dpy, win, glc);

    glEnable(GL_DEPTH_TEST);

    /* redraw on Expose, clean up and quit on any key press */
    while (1) {
        XNextEvent(dpy, &xev);

        if (xev.type == Expose) {
            XGetWindowAttributes(dpy, win, &gwa);
            glViewport(0, 0, gwa.width, gwa.height);
            DrawAQuad();
            glXSwapBuffers(dpy, win);
        } else if (xev.type == KeyPress) {
            glXMakeCurrent(dpy, None, NULL);
            glXDestroyContext(dpy, glc);
            XDestroyWindow(dpy, win);
            XCloseDisplay(dpy);
            exit(0);
        }
    }
}

What is wrong?