unity-design team mailing list archive
Message #01134
Re: Fwd: Proposal of new UI element for windows in Ubuntu: Esfera
2010/3/26 Mark Shuttleworth <mark@xxxxxxxxxx>
>
> Hi folks
>
> Got this interesting proposal from Pablo, and thought it should be sent to the list rather than handled in private correspondence. It reminds me of something David Siegel was sketching out, also inspired by the challenge of "how we can make the most of the new space".
>
> Pablo, if you're not subscribed to Ayatana, it's the best place to sketch out a proposal like this.
>
> I appreciate both the detail in the proposal and the relaxed way it's pitched!
>
> Mark
>
> -------- Original Message --------
> Subject: Proposal of new UI element for windows in Ubuntu: Esfera
> Date: Fri, 26 Mar 2010 00:42:45 +0100
> From: Pablo Quirós <mr.polmac@xxxxxxxxx>
> To: mark@xxxxxxxxxx
>
>
> Hello Mark, I've got a proposal regarding the window buttons. It is a new element to be placed in the free space at the top-right of windows.
> At first I wasn't very convinced about the UI change, and we exchanged a couple of messages on the matter in the related bug report, but I've thought about it and I agree with you that this could bring interesting possibilities.
> I've designed a concept called Esfera, which I think could be a huge step forward for the user experience, while bringing innovation to the Ubuntu desktop. The idea is explained in the attached PDF; I hope you can take the time to read it or at least send it to the Canonical Design team. Sorry about the mockup; I'm a disaster with GIMP, but I hope it illustrates the idea.
> I'd be very pleased to answer any question you may have about it. I'd just request that if you implement the idea, I appear somewhere as the author of the concept, and I'd be glad if you kept the name I've chosen.
> Of course, there are lots of ideas that go nowhere, so I'd perfectly understand if you consider it useless -- just thought it was good and wanted to share it with you.
> Regards,
> Pablo Quirós
>
Hi Pablo!
I love your idea of having a widget that represents "the window", but
I'm not too keen on the gestures; the rest of our desktop doesn't
rely on them, aiming instead for physical, direct interaction.
What this has me thinking about is a cool thing the Libwnck panel
applets like the window list and workspace switcher have, where you
can drag and drop the windows that are represented in them. For
example, I can drag and drop from the window list to another workspace
on the workspace switcher and that window is moved to that workspace.
Really slick, elegant stuff.
However, the potential beauty of that metaphor dies at the actual
physical objects, the real top-level windows themselves; dragging THEM
only positions them spatially.
It would be interesting to have your Esfera widget, but draggable as
in standard drag & drop. The rest can happen by improving that
interface, standardizing the "window" drag & drop data type and
enhancing the various components that make use of it.
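To make the idea concrete, here is a minimal sketch of what a standardized "window" drag & drop data type might look like. Everything here is an assumption for illustration: the `application/x-window-id` target name, the payload fields, and the `Workspace` drop target are all invented; no such standard existed at the time of this thread, and a real implementation would go through the toolkit's drag & drop machinery rather than plain Python classes.

```python
# Hypothetical sketch of a standardized "window" drag & drop data type.
# The MIME-style target name and payload format are assumptions for
# illustration, not an existing freedesktop.org standard.

WINDOW_TARGET = "application/x-window-id"  # hypothetical target name


def serialize_window(xid, title):
    """Encode a window reference as a drag & drop payload string."""
    return f"{xid}\t{title}"


def deserialize_window(payload):
    """Decode a payload string back into (xid, title)."""
    xid, title = payload.split("\t", 1)
    return int(xid), title


class Workspace:
    """A drop target that accepts the window data type and 'moves' the window."""

    def __init__(self, name):
        self.name = name
        self.windows = []

    def on_drop(self, mime_type, payload):
        # Reject drags of any other data type; this negotiation is what
        # lets a window list, workspace switcher, or Esfera widget all
        # speak the same drag & drop vocabulary.
        if mime_type != WINDOW_TARGET:
            return False
        self.windows.append(deserialize_window(payload))
        return True


if __name__ == "__main__":
    ws = Workspace("Workspace 2")
    payload = serialize_window(0x3A00007, "Esfera proposal.pdf")
    assert ws.on_drop(WINDOW_TARGET, payload)
    print(ws.windows)  # [(60817415, 'Esfera proposal.pdf')]
```

The point of the sketch is the shared data type: once every component negotiates the same target, any surface showing a window (or its Esfera widget) can be a drag source, and any workspace representation can be a drop target.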
For example, GNOME Shell has a big thing about dragging / dropping
windows in the overlay mode. (And this is all Really interesting with
GNOME Shell, because there is only one representation of a window at
any given time, instead of a window list and a spatial view; it feels
more direct). With the current version in upstream git (not in Lucid),
dragging and dropping a window in the overlay turns that window into
something more like the representation you get when dragging and
dropping stuff (like files) and zooms out slightly further, revealing
nearby workspaces you can drop that window to, or a button to add it
to a new workspace. We could enhance that design by having it work
within a workspace, too, so you don't need to use the overlay mode to
add a workspace visually; dragging the Esfera widget would zoom out,
revealing hotspots on the screen to perform different interactions.
(Could even be the same list you describe).
At that point, it may be interesting to improve drag / drop in
general. I believe the system can be aware of any droppable areas for
the current drag operation, so why do we torture our users by forcing
them to drag a widget all that way there over a 38" 4k pixel monitor?
Snapping or physics (think kinetic scroll, with magnetic hot zones)
could be worth some experimentation. (All things considered, the end
result of that would basically feel like gestures, but be more
wholesome interactions).
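A magnetic hot zone could be as simple as a distance-weighted pull toward the nearest drop target. The sketch below is purely illustrative: the zone positions, the attraction radius, and the linear pull curve are all assumptions, not anything implemented in GNOME Shell or Compiz at the time.

```python
import math

# Hypothetical "magnetic" hot zones for drag & drop: once the dragged
# widget enters a zone's attraction radius, the position is pulled
# toward the zone's centre so the user doesn't travel the full distance.
# The radius and pull curve here are invented for illustration.


def attract(pos, zones, radius=120.0, strength=0.6):
    """Return a drag position adjusted toward the nearest hot zone.

    pos      -- (x, y) raw pointer position
    zones    -- list of (x, y) hot-zone centres
    radius   -- distance at which attraction begins, in pixels
    strength -- 0..1, how strongly a zone pulls at its centre
    """
    x, y = pos
    nearest, nearest_d = None, radius
    for zx, zy in zones:
        d = math.hypot(zx - x, zy - y)
        if d < nearest_d:
            nearest, nearest_d = (zx, zy), d
    if nearest is None:
        return pos  # outside every zone: no adjustment
    # Pull grows linearly from 0 at the radius edge to `strength`
    # at the zone centre, so the effect fades in smoothly.
    pull = strength * (1.0 - nearest_d / radius)
    return (x + (nearest[0] - x) * pull, y + (nearest[1] - y) * pull)
```

Run each frame of a drag, this nudges the widget toward whichever droppable area is closest, which is roughly the "magnetic" feel described above without requiring the user to learn a gesture.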
By turning this into a standard drag & drop operation with a standard
data type, future technology adjustments would figure in here very
smoothly. For example, in the direction we're going — tabs being
handled by the window manager, a-window-is-a-document interfaces —
we're bound to want to assign a file URI to a window...
In short, I think an approach where the actual actions happen
*elsewhere* would allow us to communicate with a richer, more flexible
vocabulary.
Dylan