Multitouch Capability
#1
Hi

I have been using Mixbus 32c for a few days and I'm really liking the sound and workflow. My primary DAW has been PT HD and I use a Slate RAVEN to provide me with multitouch capability.

With quite a few DAWs now available with native multitouch control on both Mac and PC such as Studio One, Sonar, Bitwig and FL Studio, are there any plans for Harrison Mixbus to have multitouch capability?

I really hope Harrison will consider this. Currently I'm using single touch, which is OK for things like plugin control and editing tasks. However, I really do miss multitouch and touch-control gestures in the Mix Window for things like:

1. Moving multiple faders at once
2. Swiping mutes and solos
3. Banking
4. Panning

Maybe there could also be some dedicated touch buttons for right-click functions in the Edit window.

Hope to see this happen in a not too distant future release.
#2
Hi Simon, are you on Mac or Windows?
#3
(12-02-2016, 05:12 PM)Ben@Harrison Wrote: Hi Simon, are you on Mac or Windows?

Hi Ben

I'm on Mac.
#4
Nothing more to this conversation, guys?

Similar to Simon, I am usually using Pro Tools, but with DTouch, on a Mac.

At ACP, we considered leaving the Pro Tools pasture about a year ago, and Mixbus was a candidate, but multitouch was important, as were MIDI features.

We also looked at Studio One.

We stuck with Pro Tools, but we are now coming up on decision time again.

Harrison needs to realize that folks are looking to find that one DAW that is fabulous at everything.

Can Mixbus become that?
#5
@brucerothwell: is that a joke question? :)

Nothing can be "fabulous at everything". There's only so much screen space on the main window. Ableton Live chooses to fill it with a grid of sequencer boxes. ProTools fills it with a grid of sends/plugin widgets. Mixbus fills it with EQ and compressor knobs. They are all designed to meet the needs of a specific user.

Regarding multitouch, we are working on some multitouch solutions. But I really need to understand this:

Why aren't people just using a tablet on their desktop, with some multitouch faders and gesture-capability?

100+ years of industrial design says the correct ergonomic position for long-term work is to have the screen at eye-level, and the controls at elbow level. This could be achieved with a $50 tablet and perhaps $30 of software.
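
(As an illustration of what that "$30 of software" glue might look like, here is a rough sketch only: it assumes Mixbus inherits Ardour's OSC control surface on its default port 3819 and that the documented /strip/fader and /strip/mute messages apply; the host address and strip numbers are placeholders, so check the manual for your version.)

```python
# Rough sketch only: assumes Mixbus exposes Ardour's OSC control surface
# (enabled under the control-surface preferences) on the default port 3819,
# and that the documented /strip/fader and /strip/mute messages apply.
# The host address and strip numbers are placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 3819)  # DAW machine on the LAN

def set_fader(strip: int, position: float) -> None:
    """Move one channel strip's fader; position runs 0.0 (bottom) to 1.0 (top)."""
    client.send_message("/strip/fader", [strip, position])

def set_mute(strip: int, muted: bool) -> None:
    """Mute or unmute one channel strip."""
    client.send_message("/strip/mute", [strip, 1 if muted else 0])

# A two-finger gesture on the tablet could drive two faders at once:
set_fader(1, 0.75)
set_fader(2, 0.40)
set_mute(3, True)
```

A tablet-side multitouch UI would simply map each on-screen fader and mute button to calls like these over the network; the transport is a cheap protocol rather than dedicated hardware.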

I've used such a solution on a tablet. It's at desk-level. It has 8 touch-sensitive faders. It allows me to swipe left/right through my session. Everything about it screams cool. But I find myself not using it, or preferring physical faders instead.

Similarly, I've used the touchscreen on my son's Win10 laptop. Mixbus works on it - touching mute/solo buttons in Mixbus is pretty cool, but the screen is wobbly and you don't have nearly the same fine-control as a mouse. So I've never felt like I missed it, when I was on a different computer.

Am I weird? Am I missing something? What's the "killer aspect" of a multitouch screen that isn't already available in a touch screen, or a tablet? I want to hear from a power-user who "gets it". We don't want to make a multitouch solution that is missing the secret ingredient!

-Ben
#6
One example is to have a multi-touch monitor in an angled position, very close to where the person is sitting. This monitor can be at eye level or much lower. It's very convenient to reach out with one or two hands. I've been using 1920x1080 touchscreens for about 10 years.

In the classic computer desktop setup, I agree that a touchscreen doesn't make much sense. But nobody is forced to have such a setup at home. Personally, I never had one.
Mixbus32c, Mackie Onyx 1640, Neumann km1, WA 47 jr..MadronaLabs, Samplemodeling, UA, etc., iPad2/4/Pro
#7
(01-06-2017, 02:21 PM)Ben@Harrison Wrote: @brucerothwell: is that a joke question? :)

Nothing can be "fabulous at everything". There's only so much screen space on the main window. Ableton Live chooses to fill it with a grid of sequencer boxes. ProTools fills it with a grid of sends/plugin widgets. Mixbus fills it with EQ and compressor knobs. They are all designed to meet the needs of a specific user.

Regarding multitouch, we are working on some multitouch solutions. But I really need to understand this:

Why aren't people just using a tablet on their desktop, with some multitouch faders and gesture-capability?

100+ years of industrial design says the correct ergonomic position for long-term work is to have the screen at eye-level, and the controls at elbow level. This could be achieved with a $50 tablet and perhaps $30 of software.

I've used such a solution on a tablet. It's at desk-level. It has 8 touch-sensitive faders. It allows me to swipe left/right through my session. Everything about it screams cool. But I find myself not using it, or preferring physical faders instead.

Similarly, I've used the touchscreen on my son's Win10 laptop. Mixbus works on it - touching mute/solo buttons in Mixbus is pretty cool, but the screen is wobbly and you don't have nearly the same fine-control as a mouse. So I've never felt like I missed it, when I was on a different computer.

Am I weird? Am I missing something? What's the "killer aspect" of a multitouch screen that isn't already available in a touch screen, or a tablet? I want to hear from a power-user who "gets it". We don't want to make a multitouch solution that is missing the secret ingredient!

-Ben

Hi Ben

When using multitouch on a touch screen, I think anything smaller than a 27-inch screen becomes a more difficult experience. I have a RAVEN MTX and a RAVEN MTi. I have used Mixbus 32c on both, and they gave me a very good experience with single touch. I use both my RAVENs at a 40-degree angle, which is perfect for my needs. I have been a RAVEN user for well over 3 years, and working 8-plus hours per day isn't a problem for me at all.

As far as I know, Mixbus doesn't work in multitouch mode on any tablet or general touchscreen. If it does, please let me know.

As for multitouch in DAWs generally, it has either been implemented by the DAW developer, or applications such as the RAVEN software and DTouch provide the capability.

There is no doubt that touch is becoming popular in audio applications. So many manufacturers / developers have implemented some type of touch feature in their products / solutions.

Apart from the Avid S6, I have yet to experience a hardware controller that can compete with the RAVEN for speed, accuracy, and overall efficiency of workflow. With the RAVEN and Pro Tools, I just reach out and touch the GUI, and the software overlay is killer, including the toolbar and batch commands. No other hardware is required between my DAW and me.

Touch is certainly not for everyone. But no doubt more users are adopting touch technology within their workflow.

It's good to know that Slate will be looking to support Mixbus in the future. However, it would be even better if Harrison implemented touch and multitouch features themselves.

Just my 2 cents.
#8
Then there are prototypes of touchscreens with some kind of haptic feedback, so that you could possibly also feel the fader or knob you are interacting with.

https://tanvas.co/

Also, in aviation, glass cockpits are advancing on every level, be it on small airplanes or big ones.

https://en.wikipedia.org/wiki/Glass_cockpit

It's the best ergonomic solution. When people visit me, they often don't call my workspace a "studio" or "desktop"; they call it a "cockpit".
Mixbus32c, Mackie Onyx 1640, Neumann km1, WA 47 jr..MadronaLabs, Samplemodeling, UA, etc., iPad2/4/Pro
#9
(01-07-2017, 12:44 PM)Phil999 Wrote: Then there are prototypes of touchscreens with some kind of haptic feedback, so that you could possibly also feel the fader or knob you are interacting with.

https://tanvas.co/

Also, in aviation, glass cockpits are advancing on every level, be it on small airplanes or big ones.

https://en.wikipedia.org/wiki/Glass_cockpit

It's the best ergonomic solution. When people visit me, they often don't call my workspace a "studio" or "desktop"; they call it a "cockpit".

I watched the video on Tanvas' home page... I'm sorry, but it kind of turned me off... and it made me a bit sad too. My (grown) kids got to feel corduroy and golf balls - the real thing. It reminds me of a radio commercial from a few years ago, where a father and son take a trip to the country and the kid asks, "What's that, Dad?" "Son, that's a tree." The kid replies, "Whoa!"

Don't get me wrong, I am not anti-technology, but being pretty "old-school", I agree with Ben about the feel of real faders, knobs, and dials. Not to mention being able to have my eyes and brain focus on a solid object illuminated by daylight (when available) rather than on luminescent, flickering (albeit rapidly) panels for hours on end. Yes, I can do infinitely more sitting at Mixbus and my three screens than I could years ago with a tape machine and console, but that's primarily due to editing efficiencies and patch/configuration flexibility.

My prediction is that, for the next five years, the cost of even a small TanvasTouch device could buy me 32 channels of a decent hardware control surface.

TanvasTouch technology IS cool, and I can envision some interesting uses in education, medicine, and yes, entertainment.

D.
MB, MB32C, V10, Win 7/10, MacOS Catalina and Ventura, MBP & Ryzen platforms, nVidia and Radeon GPU’s, Focusrite Saffire Pro 40, Mackie Onyx 1640, X-Touch, 43” UHD, x2 27” 4K

#10
Yes, I agree. I don't expect these haptic-feedback touchscreens to appear in everyday devices very soon; it's just an example of how touchscreen technology is still evolving and may become even more useful in the future.

In my home studio, I have a multi-touch monitor for the Mixbus mixer window, but also a physical control surface (QCon Icon). I mostly use the physical faders, buttons, and knobs, and very often the jog wheel, but depending on where my hands are, sometimes also the touchscreen. I use the touchscreen most for interacting with plugins, especially synth plugins.

Now, when I'm recording somewhere else, I don't have the control surface with me, but I may have a multi-touch monitor available (because I use it anyway for the plugins). And it does make sense to have multiple input points, for example when you are adjusting an EQ frequency with one hand and, during that action, suddenly want to lower a fader with the other hand (without losing the first hand's focus).
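
(To make that "multiple input points" idea concrete, here is a rough sketch only, not any real Mixbus or RAVEN API: a multi-touch surface reports a stable id per finger, so each finger can hold its own control without stealing focus from the others, whereas single touch can only track one grab at a time. The control names and event shape below are invented for illustration.)

```python
# Rough sketch only: the event shape and control names are invented.
# The point is that each finger has a stable id, so two controls can be
# "grabbed" at once without either one losing focus.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    touch_id: int   # stable id the touchscreen assigns to one finger
    x: float        # normalised screen coordinates, 0..1
    y: float
    phase: str      # "down", "move", or "up"

active_grabs: dict[int, str] = {}   # touch_id -> control currently held

def control_at(x: float, y: float) -> str:
    # Hypothetical hit test; a real UI maps coordinates to widgets.
    return "eq_freq" if x < 0.5 else "fader_2"

def handle(ev: TouchEvent) -> None:
    if ev.phase == "down":
        active_grabs[ev.touch_id] = control_at(ev.x, ev.y)
    elif ev.phase == "move" and ev.touch_id in active_grabs:
        print(f"adjust {active_grabs[ev.touch_id]} -> {ev.y:.2f}")
    elif ev.phase == "up":
        active_grabs.pop(ev.touch_id, None)

# One finger holds the EQ frequency while the other lowers a fader:
handle(TouchEvent(0, 0.2, 0.6, "down"))   # left hand grabs the EQ
handle(TouchEvent(1, 0.8, 0.9, "down"))   # right hand grabs a fader
handle(TouchEvent(1, 0.8, 0.5, "move"))   # fader moves...
handle(TouchEvent(0, 0.2, 0.4, "move"))   # ...and the EQ grab is unaffected
```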

While single-touch is sufficient for me, multi-touch would always be better.
Mixbus32c, Mackie Onyx 1640, Neumann km1, WA 47 jr..MadronaLabs, Samplemodeling, UA, etc., iPad2/4/Pro