Lovers of Touchscreens with DAWs
#31
Basically, yes... there shouldn't be any reason for applications to support a touch screen specifically, unless the OS doesn't support it. But if the touch screen comes with its own driver (presumably it does?) you'd imagine that the driver would serve the OS, rather than individual applications.
Knowledge is knowing a tomato is a fruit...
Wisdom is knowing you don't put tomatoes in a fruit salad !!
Reply
#32
@Owlmerlyn:

Most touch-screens appear to the OS as a mouse device. For example, I can control Mixbus using the built-in touchscreen of my son's Windows 10 laptop.

As you mention, a regular touchscreen is missing some basic features of a mouse: multiple buttons and a scroll-wheel.

A multi-touch touchscreen can simulate some of those features using gestures: for example, a two-finger tap might emulate a right-click, or a two-finger drag might emulate a scroll-wheel action. In theory, these gestures can be configured to operate on Mixbus as-is. I've never investigated it, but presumably you can configure your OS to send a "scroll wheel" action when the user does a two-finger drag, or trigger a "zoom" action when the user does a two-finger "pinch". Has anybody tried that yet?
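Conceptually, that OS-level translation layer boils down to classifying the relative motion of two touch points. Here's a minimal sketch in Python; the function name and thresholds are invented for illustration and are not taken from any real touchscreen driver API:

```python
# Hypothetical sketch of a gesture-to-mouse translation layer.
# classify_gesture() looks at how two touch points moved and decides
# whether the gesture should map to a synthetic scroll or zoom action.

import math

def classify_gesture(p1_start, p1_end, p2_start, p2_end, threshold=10.0):
    """Return 'scroll' if both fingers moved the same way,
    'zoom' if the distance between them changed, else 'none'."""
    d1 = (p1_end[0] - p1_start[0], p1_end[1] - p1_start[1])
    d2 = (p2_end[0] - p2_start[0], p2_end[1] - p2_start[1])
    spread_before = math.dist(p1_start, p2_start)
    spread_after = math.dist(p1_end, p2_end)

    # Both fingers moving in parallel -> treat as a scroll-wheel action
    if math.dist(d1, (0, 0)) > threshold and math.dist(d2, (0, 0)) > threshold:
        dot = d1[0] * d2[0] + d1[1] * d2[1]
        if dot > 0 and abs(spread_after - spread_before) < threshold:
            return "scroll"
    # Fingers moving apart or together -> treat as a pinch zoom
    if abs(spread_after - spread_before) > threshold:
        return "zoom"
    return "none"
```

A real gesture recognizer would also track timing and velocity, but the classification logic is essentially this.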

However, this still doesn't get you true multi-touch operation. If Mixbus were re-programmed to support multi-touch, we would also get the benefit of (for example) touching 2+ faders at one time. And I think that's the feature that gets most touchscreen users excited.

Some would argue that a $499 Faderport8 provides a better multi-fader experience than a touch screen. However, a touchscreen has its own benefits: you can show a lot more information on a big monitor than a little control surface. Note that to "really" get the benefits of a touchscreen, the program would need to be developed with tablet-like conventions.

There's not much economic incentive for Harrison to develop such a multitouch interface at the moment, because only a percentage of our users have a dedicated desktop with multitouch capabilities. Furthermore we have our own development agenda to pursue which is largely focused on the "sound" of Mixbus and associated plugins. It's not that we don't want to implement multitouch - it's just that we already have lots of big plans that will apply to "all" of our users.

HOWEVER: The beauty of our open-source development model is that someone outside of Harrison -could- develop this multitouch interface; either by changing Mixbus itself, or by creating an invisible "overlay" window which intercepts multi-touch actions and transmits them to Mixbus. The Slate Raven, for example, has an implementation like that: it captures the touches, and then re-transmits them to the underlying DAW using various available protocols. Since Mixbus is open-source and therefore has no limits on what can be controlled, the integration with Mixbus could be much tighter.
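The overlay idea can be sketched in a few lines: hit-test each touch point against a fader layout, turn its vertical position into a gain value, and hand the resulting messages to whatever protocol carries them to the DAW (OSC, MIDI, etc.). Everything below (the layout constants and function name) is invented for illustration; this is not Mixbus or Raven code:

```python
# Illustrative sketch of the "overlay" approach: hit-test each active
# touch point against a layout of fader strips, then turn the touch's
# vertical position into a gain message. A real overlay would forward
# these messages to the DAW over some control protocol.

FADER_WIDTH = 60      # assumed pixel width of one channel strip
FADER_HEIGHT = 400    # assumed pixel height of the fader travel

def touches_to_fader_messages(touches):
    """Map (x, y) touch points to (channel, gain) pairs.

    gain is 0.0 at the bottom of the fader travel and 1.0 at the top,
    so several fingers yield several independent fader moves."""
    messages = []
    for x, y in touches:
        channel = int(x // FADER_WIDTH)                    # which strip was hit
        gain = 1.0 - min(max(y / FADER_HEIGHT, 0.0), 1.0)  # clamp to travel
        messages.append((channel, round(gain, 3)))
    return messages
```

Two fingers on two strips produce two independent messages, which is exactly the multi-fader behavior described above.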

If our users -really- want this feature, there's nothing stopping you from funding and developing it!

-Ben
Reply
#33
Physical faders being obsolete is a bit of a stretch. And I still can't figure out how I can use a touch screen consistently without looking at it. Like I can with a physical mixer. So both have their advantages. Let's not say one is vastly superior to the other just yet.
Reply
#34
I'm not totally disagreeing with you. IMO the perfect controller is a mixture of real knobs, buttons, and faders along with a touch screen.

I've tried to explain a rough idea of my vision before. But Mixbus is in a unique position: the software is designed to emulate the hardware workflow. So you make the real faders and knobs control just the Mixbus part, 1:1. There are several ways to do this; probably making it 1:1 for 32C, with some unused buttons for regular Mixbus, would be the simplest.

Then you have a touch screen to control any plug-ins and anything else that isn't part of the Harrison mixer, because I don't edit plug-in parameters by feel. All "I" really want/need is for faders and transport control to be tactile. But having the EQ and buss sends available too would be icing on the cake. The rest can be touch screen.

IMO that's the best of both worlds.
Reply
#35
(08-02-2017, 07:59 AM)johne53 Wrote: Forgive my ignorance but isn't a touch screen really just an elaborate mouse? [...] I'm saying "elaborate" because a touch screen can presumably send at least two pairs of touch co-ordinates (e.g. to facilitate zooming)

This brings up an interesting point that I'd never thought about...

The only touch screen that I own is my mobile phone. If I touch two parts of the screen and move my fingers relative to each other it changes the zoom factor. This seems to be the same for all applications (i.e. it's a built-in part of the touch screen's functionality?)

But for audio apps... that isn't normally what you'd want. If you touch (say) two faders, you'd want your movements to change the sound level on those two channels. You wouldn't want it to start varying the zoom factor!

How does a touch screen know this??
Knowledge is knowing a tomato is a fruit...
Wisdom is knowing you don't put tomatoes in a fruit salad !!
Reply
#36
(08-02-2017, 08:49 PM)Matt Wrote: I'm not totally disagreeing with you. IMO the perfect controller is a mixture of real knobs, buttons, and faders along with a touch screen.

I've tried to explain a rough idea of my vision before. But Mixbus is in a unique position: the software is designed to emulate the hardware workflow. So you make the real faders and knobs control just the Mixbus part, 1:1. There are several ways to do this; probably making it 1:1 for 32C, with some unused buttons for regular Mixbus, would be the simplest.

Then you have a touch screen to control any plug-ins and anything else that isn't part of the Harrison mixer, because I don't edit plug-in parameters by feel. All "I" really want/need is for faders and transport control to be tactile. But having the EQ and buss sends available too would be icing on the cake. The rest can be touch screen.

IMO that's the best of both worlds.

That's what I'm trying to have/do. I will probably first have a 10" touch screen for plugins to complement my MCU. I'm thinking of perhaps using a tablet with an X server; I just don't know yet how to easily beam the plugins' GUIs there.
Next will be my hardware project to create a 1:1 version of the channel strip and a master section. If you keep this modular you can have channel strips for both Mixbus and Mixbus32C available. And Mixbus strips.
Now, on top of every strip you could put either a "dumb" meter or a little (touch) screen for more comprehensive info/manipulation. In the touch-screen case, it could display a list of the plugins in the channel; when you tap an item, it either shows that plugin's GUI for direct manipulation, or displays it on the above-mentioned (bigger) touch screen, which in my personal workstation will sit on the right-hand side, next to the monitor station. One tap and you have the right plugin for the right channel right under your fingertips, without disturbing the mixer view, without the hassle of deciding what should be in the fore/background and have focus, and with no window-moving to reach a fader hidden beneath. You can adjust the reverb with your right hand while adjusting the EQ directly in the channel strip with your left.
(I believe I have written something along these lines earlier... never denied a touch screen is an excellent complement to a control surface)

MMM
Reply
#37
(08-02-2017, 10:56 AM)Ben@Harrison Wrote: Most touch-screens appear to the OS as a mouse device. For example, I can control Mixbus using the built-in touchscreen of my son's Windows 10 laptop.

Ok, so a bit like generic MCU mode vs Native mode for fader controllers. But the point is some functionality already exists "off the shelf"...

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: As you mention, a regular touchscreen is missing some basic features of a mouse: multiple buttons and a scroll-wheel.

A multi-touch touchscreen can simulate some of those features using gestures: for example, a two-finger tap might emulate a right-click, or a two-finger drag might emulate a scroll-wheel action. In theory, these gestures can be configured to operate on Mixbus as-is. I've never investigated it, but presumably you can configure your OS to send a "scroll wheel" action when the user does a two-finger drag, or trigger a "zoom" action when the user does a two-finger "pinch". Has anybody tried that yet?

Yes, exactly. Is it not possible that touch screens come with drivers, or an "in-between" layer, that emulate the data from the screen as if it were coming from a mouse? From what I understand, the Raven has to be "implemented" into DAWs, but I don't understand why it can't work in a generic mouse mode even if there is no native implementation with added features.

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: However, this still doesn't get you true multi-touch operation. If Mixbus were re-programmed to support multi-touch, we would also get the benefit of (for example) touching 2+ faders at one time. And I think that's the feature that gets most touchscreen users excited.

Yes, and this is where I do understand the excitement. Multi-touch makes a world of difference to the workflow and "playability" of the mix. I like the tag line from the Raven ad which refers to the Raven as an "instrument".

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: Some would argue that a $499 Faderport8 provides a better multi-fader experience than a touch screen.


And I might just agree, lol.

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: However, a touchscreen has its own benefits: you can show a lot more information on a big monitor than a little control surface. Note that to "really" get the benefits of a touchscreen, the program would need to be developed with tablet-like conventions.

I agree, having a lot more information available is a big plus of touch screens. As well as being able to access that information without having to bank through 8 faders at a time.

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: There's not much economic incentive for Harrison to develop such a multitouch interface at the moment, because only a percentage of our users have a dedicated desktop with multitouch capabilities. Furthermore we have our own development agenda to pursue which is largely focused on the "sound" of Mixbus and associated plugins. It's not that we don't want to implement multitouch - it's just that we already have lots of big plans that will apply to "all" of our users.

If I can throw something in here. I obviously do not have to concern myself with budgets and corporate strategy, which always fence in the realm of what is possible. And yes, we all do want you guys to keep working on the sound and the things that are universally beneficial.

But... based on my experience with the Faderport 8, it has struck me how much of a difference a physical controller makes to the mixing experience. For the first time I am actually enjoying the mix experience and starting to get an inkling of what it must feel like to be one of the big boys sitting in front of an SSL, or an actual 32C. Also, whilst I do continue to use Reaper for certain things, it really irks me to not have the same control, and I want to get back to 32C as fast as possible.

So, my point: it could be a real differentiator for Harrison (and a strong selling point) to become known as a physical-controller-friendly developer. Not all users would jump on board, but I think you would boost your hard-core fans quite significantly (for example, my hard-core fan-ness jumped multiple points during the recent upgrade of FP8 functionality). And hard-core fans help to sell the product. I am pretty sure that the Live/Push pairing has done good things for Ableton, for instance. Anyway, I don't want to be treading where I shouldn't, but those are my thoughts.

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: HOWEVER: The beauty of our open-source development model is that someone outside of Harrison -could- develop this multitouch interface; either by changing Mixbus itself, or by creating an invisible "overlay" window which intercepts multi-touch actions and transmits them to Mixbus. The Slate Raven, for example, has an implementation like that: it captures the touches, and then re-transmits them to the underlying DAW using various available protocols. Since Mixbus is open-source and therefore has no limits on what can be controlled, the integration with Mixbus could be much tighter.

If our users -really- want this feature, there's nothing stopping you from funding and developing it!

-Ben

This could be a project for sure; it seems there are some quite tech-savvy people in this forum.

(08-02-2017, 02:07 PM)Matt Wrote: Physical faders being obsolete is a bit of a stretch. And I still can't figure out how I can use a touch screen consistently without looking at it. Like I can with a physical mixer.

Amen to that. Exactly why I want physical faders
Reply
#38
(08-03-2017, 11:48 AM)Owlmerlyn Wrote: But... based on my experience with the Faderport 8, it has struck me how much of a difference a physical controller makes to the mixing experience. For the first time I am actually enjoying the mix experience and starting to get an inkling of what it must feel like to be one of the big boys sitting in front of an SSL, or an actual 32C.


This. Exactly this.
Reply
#39
(08-02-2017, 02:07 PM)Matt Wrote: Physical faders being obsolete is a bit of a stretch. And I still can't figure out how I can use a touch screen consistently without looking at it. Like I can with a physical mixer. So both have their advantages. Let's not say one is vastly superior to the other just yet.


I also agree that saying that physical faders are obsolete is a stretch.

I do not totally understand the idea of using a physical mixer without looking at it. I owned an MCU Pro and I would still look at it to touch the correct fader(s). Sure I could close my eyes after finding the correct fader, but I wouldn't feel around for the correct fader(s) with my eyes closed.

(08-02-2017, 10:56 AM)Ben@Harrison Wrote: HOWEVER: The beauty of our open-source development model is that someone outside of Harrison -could- develop this multitouch interface; either by changing Mixbus itself, or by creating an invisible "overlay" window which intercepts multi-touch actions and transmits them to Mixbus. The Slate Raven, for example, has an implementation like that: it captures the touches, and then re-transmits them to the underlying DAW using various available protocols. Since Mixbus is open-source and therefore has no limits on what can be controlled, the integration with Mixbus could be much tighter.

If our users -really- want this feature, there's nothing stopping you from funding and developing it!

-Ben

Any chance of Harrison working with Devil Technologies to make a DTouch version for Mixbus 32C? Just throwing it out there. ;-)

(08-03-2017, 01:09 AM)johne53 Wrote:
(08-02-2017, 07:59 AM)johne53 Wrote: Forgive my ignorance but isn't a touch screen really just an elaborate mouse? [...] I'm saying "elaborate" because a touch screen can presumably send at least two pairs of touch co-ordinates (e.g. to facilitate zooming)

This brings up an interesting point that I'd never thought about...

The only touch screen that I own is my mobile phone. If I touch two parts of the screen and move my fingers relative to each other it changes the zoom factor. This seems to be the same for all applications (i.e. it's a built-in part of the touch screen's functionality?)

But for audio apps... that isn't normally what you'd want. If you touch (say) two faders, you'd want your movements to change the sound level on those two channels. You wouldn't want it to start varying the zoom factor!

How does a touch screen know this??

I do not think the touchscreen can determine this as-is, but with custom programming, multitouch fader use can be achieved. I use DTouch with Cubase and can move 10 faders independently at the same time.
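To make that point concrete: the screen itself only reports touch coordinates; it's the application (or an overlay like DTouch) that hit-tests each touch against the widgets underneath and decides whether two fingers mean "two faders" or "pinch zoom". A hypothetical sketch in Python, with all names invented for the example:

```python
# Sketch of how an application (not the screen itself) can resolve the
# ambiguity: hit-test each touch against the widgets underneath it.
# Two fingers on two faders -> two independent fader drags; two fingers
# on empty canvas -> a pinch zoom.

def resolve_touches(touches, fader_regions):
    """touches: list of (x, y); fader_regions: list of (x0, y0, x1, y1).

    Returns ('faders', [indices]) when every touch lands on a fader,
    ('pinch', []) for two free touches, or ('none', []) otherwise."""
    hits = []
    for x, y in touches:
        hit = None
        for i, (x0, y0, x1, y1) in enumerate(fader_regions):
            if x0 <= x <= x1 and y0 <= y <= y1:
                hit = i
                break
        hits.append(hit)
    if hits and all(h is not None for h in hits):
        return ("faders", hits)
    if len(touches) == 2:
        return ("pinch", [])
    return ("none", [])
```

So a phone's global pinch-zoom is just the default interpretation; a DAW that knows where its faders live can claim those touches for itself.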
Reply
#40
(08-03-2017, 01:46 PM)rcprod Wrote:
(08-02-2017, 02:07 PM)Matt Wrote: Physical faders being obsolete is a bit of a stretch. And I still can't figure out how I can use a touch screen consistently without looking at it. Like I can with a physical mixer. So both have their advantages. Let's not say one is vastly superior to the other just yet.


I also agree that saying that physical faders are obsolete is a stretch.

I do not totally understand the idea of using a physical mixer without looking at it. I owned an MCU Pro and I would still look at it to touch the correct fader(s).

Oh, come on. Don't you play guitar or piano or bass or something? You don't have to look at the thing to play it most of the time. You just navigate tons of stuff *without* looking at it. Same thing with physical faders.
Reply

