hybrid-graphics-linux team mailing list archive
Re: some hybrid linux hack (rather dirty)
To: hybrid-graphics-linux@xxxxxxxxxxxxxxxxxxx
From: Joakim Gebart <joakim.gebart@xxxxxx>
Date: Fri, 06 May 2011 11:28:59 +0200
In-reply-to: <1304260229.2091.0.camel@joaquin-Vostro-3500>
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.17) Gecko/20110430 Lightning/1.0b3pre Thunderbird/3.1.10
I've been using this with great success for the last couple of days. On my
i3 at 2.4 GHz I can run it at 60 fps (with the usleep in the code changed to
16000) at around 30% CPU usage on a single core.
In my opinion the other method, using VirtualGL and passing the data as JPEG,
is not the optimal solution to the Optimus problem and feels a bit too
complex. The quick and dirty solution of sharing an RGB buffer between the
two servers should achieve a better framerate after some optimization, since
we don't need to do any encoding of the data.
For comparison: I get around 4000 fps in glxgears on my GeForce GT 310 using
this method, although I only display 60 fps on the Intel card.
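For anyone who wants to try the buffer-sharing approach without hunting down the original post, here is a minimal sketch of the idea; it is not Florian's actual windump code, and the display names (":1" for the NVIDIA server, ":0" for the Intel one), the window setup and the missing error handling are my own assumptions:

/*
 * Minimal sketch of the shared-RGB-buffer idea (not the original windump
 * code): grab the root window of the NVIDIA X server with MIT-SHM and put
 * the raw pixels onto a window on the Intel X server, with no encoding step.
 * Assumes both servers run at the same depth and byte order.
 * Compile with: gcc copyloop.c -o copyloop -lX11 -lXext
 */
#include <stdio.h>
#include <unistd.h>
#include <sys/ipc.h>
#include <sys/shm.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <X11/extensions/XShm.h>

int main(void)
{
    Display *src = XOpenDisplay(":1");   /* NVIDIA server (assumed name) */
    Display *dst = XOpenDisplay(":0");   /* Intel server (assumed name)  */
    if (!src || !dst) {
        fprintf(stderr, "cannot open displays\n");
        return 1;
    }

    Window srcroot = DefaultRootWindow(src);
    XWindowAttributes sa;
    XGetWindowAttributes(src, srcroot, &sa);

    /* Shared-memory XImage backing the RGB buffer. */
    XShmSegmentInfo shm;
    XImage *img = XShmCreateImage(src, sa.visual, sa.depth, ZPixmap, NULL,
                                  &shm, sa.width, sa.height);
    shm.shmid = shmget(IPC_PRIVATE, img->bytes_per_line * img->height,
                       IPC_CREAT | 0600);
    shm.shmaddr = img->data = shmat(shm.shmid, NULL, 0);
    shm.readOnly = False;
    XShmAttach(src, &shm);

    /* Plain output window on the Intel server. */
    Window win = XCreateSimpleWindow(dst, DefaultRootWindow(dst), 0, 0,
                                     sa.width, sa.height, 0, 0, 0);
    XMapWindow(dst, win);
    GC gc = XCreateGC(dst, win, 0, NULL);
    XSync(dst, False);

    for (;;) {
        XShmGetImage(src, srcroot, img, 0, 0, AllPlanes);  /* grab a frame */
        /* Plain XPutImage on the destination; XShmPutImage would need a
         * second XShmAttach of the same segment against dst. */
        XPutImage(dst, win, gc, img, 0, 0, 0, 0, sa.width, sa.height);
        XSync(dst, False);
        usleep(16000);   /* 16000 us per iteration caps the loop at ~60 fps */
    }
}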
I've been playing with the idea of running the windump program as the root
window on the Intel X display and then running GNOME (or whatever) only on
the NVIDIA display. Another thing I've been wanting to test is using the
Intel card's OpenGL for rendering, putting the XShm image into a texture.
Syncing to the Intel card's refresh rate should also be easy with OpenGL.
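As a starting point for the OpenGL variant, here is a rough sketch of what uploading each grabbed frame into a texture on the Intel side could look like. The GL_BGRA upload format, the display names and the fixed-function quad are assumptions about a typical 24-depth little-endian setup, not something taken from the posted code; with vsync enabled in the Intel driver, glXSwapBuffers itself provides the refresh-rate syncing mentioned above.

/*
 * Sketch of the OpenGL variant: instead of XPutImage, upload the grabbed
 * XShm frame into a texture every iteration and draw it with the Intel
 * card's OpenGL. GL_BGRA assumes a 32-bit, 24-depth ZPixmap on a
 * little-endian machine; display names are assumptions as before.
 * Compile with: gcc gldump.c -o gldump -lX11 -lXext -lGL
 */
#include <sys/ipc.h>
#include <sys/shm.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <X11/extensions/XShm.h>
#include <GL/gl.h>
#include <GL/glx.h>

int main(void)
{
    /* Source: root window of the NVIDIA server, grabbed via MIT-SHM
     * (same setup as in the previous sketch, minus error handling). */
    Display *src = XOpenDisplay(":1");
    Window srcroot = DefaultRootWindow(src);
    XWindowAttributes sa;
    XGetWindowAttributes(src, srcroot, &sa);

    XShmSegmentInfo shm;
    XImage *img = XShmCreateImage(src, sa.visual, sa.depth, ZPixmap, NULL,
                                  &shm, sa.width, sa.height);
    shm.shmid = shmget(IPC_PRIVATE, img->bytes_per_line * img->height,
                       IPC_CREAT | 0600);
    shm.shmaddr = img->data = shmat(shm.shmid, NULL, 0);
    shm.readOnly = False;
    XShmAttach(src, &shm);

    /* Destination: a double-buffered GLX window on the Intel server. */
    Display *dst = XOpenDisplay(":0");
    int visattr[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dst, DefaultScreen(dst), visattr);
    XSetWindowAttributes wa;
    wa.colormap = XCreateColormap(dst, DefaultRootWindow(dst), vi->visual,
                                  AllocNone);
    Window win = XCreateWindow(dst, DefaultRootWindow(dst), 0, 0,
                               sa.width, sa.height, 0, vi->depth,
                               InputOutput, vi->visual, CWColormap, &wa);
    XMapWindow(dst, win);
    GLXContext ctx = glXCreateContext(dst, vi, NULL, True);
    glXMakeCurrent(dst, win, ctx);

    /* One texture, re-uploaded every frame with glTexSubImage2D. */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, sa.width, sa.height, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, NULL);
    glEnable(GL_TEXTURE_2D);

    for (;;) {
        XShmGetImage(src, srcroot, img, 0, 0, AllPlanes);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, sa.width, sa.height,
                        GL_BGRA, GL_UNSIGNED_BYTE, img->data);
        /* Fullscreen textured quad; row 0 of the grab is mapped to the top. */
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(-1.0f,  1.0f);
        glTexCoord2f(1, 0); glVertex2f( 1.0f,  1.0f);
        glTexCoord2f(1, 1); glVertex2f( 1.0f, -1.0f);
        glTexCoord2f(0, 1); glVertex2f(-1.0f, -1.0f);
        glEnd();
        /* With vsync on in the Intel driver, SwapBuffers blocks until the
         * next refresh, so no usleep() is needed to pace the loop. */
        glXSwapBuffers(dst, win);
    }
}

Re-uploading into an existing texture with glTexSubImage2D avoids reallocating storage every frame; a pixel buffer object could cut another copy, but that goes beyond a sketch.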
Florian: you didn't mention any license on the code you posted; would you
like to release it under the GPL?
I'm interested in working some more on this program, but I want to be able
to release my modifications if I manage to achieve anything. As markc said,
GitHub is excellent for this kind of project.
Best regards,
Joakim
On 2011-05-01 16:30, Joaquín Ignacio Aramendía wrote:
> On Sun, 01-05-2011 at 12:13 +1000, Pigeon wrote:
>> Hi Florian,
>>
>>> I recently bought a laptop (MSI CX640) with NVIDIA Optimus, not
>>> knowing that it would be pretty unusable for me on Linux.
>>> However, I found a (rather dirty) hack that lets me at least use my
>>> NVIDIA graphics card, so that I don't have to throw my laptop into
>>> the trash bin.
>> I've been experimenting with the same kind of approach using VNC
>> (x11vnc + vncviewer). I'm running two X servers on the same host, one
>> Intel and one NVIDIA. I have to force the two X servers onto the same
>> VT to trick them both into "rendering" at the same time; rather dirty too.
>>
>> Performance isn't great, as expected. I got quite a lot of tearing
>> when I was testing with some GL apps/games, but it's usable at least.
>>
>> I haven't tried your windump app yet. I believe you can get some
>> performance gain by using the XDamage extension, which x11vnc uses by
>> default. Though I'm not an X expert either.
>>
>> This is on an Alienware M11x-R2 btw.
>>
>>
>> Pigeon.
>
> Small steps for you... big ones for hybrid graphics I think :)
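To make Pigeon's XDamage suggestion concrete: the idea is to let the X server report when (and where) the screen contents changed, instead of copying blindly on a timer. A minimal sketch follows, reusing the src/srcroot/img setup from the earlier sketches; the function itself is hypothetical, not code from this thread:

/*
 * Sketch of the XDamage idea: let the NVIDIA X server tell us when the
 * screen changed and only copy then, instead of polling on a timer.
 * Assumes the src/srcroot/img setup from the earlier sketches; link with
 * -lXdamage in addition to -lX11 -lXext.
 */
#include <X11/Xlib.h>
#include <X11/extensions/Xdamage.h>

void damage_driven_loop(Display *src, Window srcroot)
{
    int ev_base, err_base;
    if (!XDamageQueryExtension(src, &ev_base, &err_base))
        return;                                /* extension not available */

    /* One event per batch of damage rather than one per rectangle. */
    Damage dmg = XDamageCreate(src, srcroot, XDamageReportNonEmpty);

    for (;;) {
        XEvent e;
        XNextEvent(src, &e);                   /* blocks until something changes */
        if (e.type == ev_base + XDamageNotify) {
            /* Acknowledge the damage so the server keeps reporting it. */
            XDamageSubtract(src, dmg, None, None);
            /* ...XShmGetImage + XPutImage/texture upload as above, ideally
             * restricted to the damaged rectangles. */
        }
    }
}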