I guess I should expand on why not a laptop, but it boils down to removing lots of little friction points that make ephemeral ideas harder to put down.
1. Touch-first is important because your instruments are touch-based as well. Being able to record my live instruments without having to switch to a trackpad to move between UI elements seems small, but it really keeps me in the zone.
2. Same as the point above, but basically being able to switch between my live instrument and an on-screen virtual one is a big deal for layering up an idea. You can’t do that easily with a trackpad or mouse.
3. There’s something to be said for the mobile platforms’ single-app focus model. I know you can full-screen a Mac app, or split-screen on an iPad, but having a singular focus helps reduce distractions.
4. Weight/portability are quite different. Maybe a MacBook Air would be better, but I take my iPad Pro to more places with me than I take my much heavier MacBook Pro. I’m already taking it to draw on the go, so this is just one more thing I can do without needing to buy a new device or carry a heavier one.
Edit: oh, and number 5 is a big one. I can put my iPad Pro on my existing music stand, which works for me at multiple height levels. There are laptop stands that might work too, but I already use my iPad for sheet music, so it’s already a convenient setup.
I think there is a lot going for the music ecosystem on iPad - lots of affordable apps, effects that are cheap compared to desktop, etc.
At the same time, I don't think the device itself is built with the connectivity for the kind of music making I think about. If all your synths and effects live on the iPad and you don't care about aftertouch, maybe, but if you want to hook up external gear the iPad just isn't made for it.
No headphone out. Requires an external audio interface to connect to hardware synths. Not an aftertouch-capable control surface. Not really portable if I have to lug a bunch of supporting equipment to make it usable.
I agree to a large extent, but I've been pleasantly surprised by just how much audio hardware works through a USB hub, especially now that all iPads have USB-C ports, with the exception of the 10.2-inch entry-level one.
Given how many USB accessories now work with the iPad, I really really wish Apple would relent and at least give us two USB ports on the side like the MacBook Air.
I get a spare USB-C port from my Magic Keyboard. I generally plug the charger into it and something else into the on-board port, but I have the option to put devices into both ports.
Regarding aftertouch, it does support it in a way (and it's shown in the video). For many synths, if you press a key on the on-screen keyboard, hold it, and slide your finger up towards the back end of the key, that gets translated into aftertouch. GarageBand on iPad has supported this for as long as I've been using it (a few years). I don't know if every synth supports it, but there are definitely some that do.
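For the curious, here's a rough sketch of how that gesture could be wired up in an app's own touch handling, assuming each key is drawn as a UIView. The view name, callback, and mapping are made up for illustration; this is not how GarageBand actually implements it.

```swift
import UIKit

// Hypothetical key view: sliding the finger toward the back (top) of the key
// raises an aftertouch-style value from 0 to 127.
final class KeyView: UIView {

    var onAftertouch: ((_ amount: Int) -> Void)?

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let y = touch.location(in: self).y
        // y == bounds.height is the front edge of the key (no pressure),
        // y == 0 is the back edge (full aftertouch).
        let normalized = 1.0 - min(max(y / bounds.height, 0), 1)
        onAftertouch?(Int(normalized * 127))
    }
}
```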
I think I have seen that some users connect AVB-capable devices (e.g. from PreSonus or MOTU) to the USB port of iPads and get full connectivity to the rest of the studio setup via AVB networking. Some AVB devices are quite affordable nowadays. That would also allow for great audio routing setups at a gig, I assume.
I have a PreSonus audio interface. It works 100% with my iPad Pro and has for years. But if I have to take an iPad and an audio interface and... and... and..., then why do I need an iPad at all? I could be using my laptop and have more options.
Even as a paying Bitwig subscriber, I don't get it. My instruments are velocity and aftertouch sensitive. Having a touchscreen interface for these instruments is no better than typing on your keyboard to play notes. And then if you want to plug in hardware, you have to either settle for one thing and no charging or buy a dongle.
> being able to switch between my live instrument and an on screen virtual one is a big deal to layer up an idea. Can’t do that easily with a trackpad or mouse.
Keyboard shortcuts exist. Failing that, it's one click on my trackpad to switch channels.
> There’s something to be said for the mobile platforms single app as a focus model.
There's also something to be said for multitasking desktops that give you the flexibility of both.
It's a neat option for people who already use Logic Pro, but touch-enabled DAWs have existed for over a decade (and they all kinda suck - ask me how I know). Besides the form factor of the iPad being fairly convenient, there's no way I'd trade in my laptop's IO and OS for a mobile one. The iPad doesn't offer enough flexibility to replace my DAW laptop.
Every time you’ve replied to one of my comments in the past, it’s been very binary in terms of workflows and the discussions always devolve.
So this time I’m leaving it as: I explained why it’s a big deal for me. If you have your own workflows or needs, so be it. But it’s irrelevant to why I like it.
I don’t need velocity for a quick idea for example. I just need to get the idea out as quickly as possible. Besides, you can always connect an external midi keyboard or audio interface to an iPad as well.
All that's well and good; I'm mostly just agreeing with the people who suggest a MacBook for this. The iPad runs iOS, so it's gimped for all kinds of creative applications. I won't judge you if you've found a workflow it fits, but the status quo feels unchanged. Professional DAWs existed on iPad before, and the world beneath our feet didn't crumble or anything.
> Besides, you can always connect an external midi keyboard or audio interface to an iPad as well.
Sure? Everyone can use class-compliant USB audio hardware. Android and Linux both support it, for whatever that's worth. Both of those OSes even support audio plugins, too.
Not everyone can install MPE drivers or get that Focusrite interface set up properly without software though. That's why I still carry a laptop, it may not apply to everyone.
How hassle-free it is for a given workflow or user is highly subjective and relative to the task at hand. The file management on iOS is still a dumpster fire.
It lacks plenty of common power-user functionality for doing things efficiently. It's fine for casual users, who don't know what's missing, but it's pretty poor compared to a fully fledged file manager on a desktop computer.
I'm with you on every one of your preference points. My own extra one is that I spend all day every day at work on my Mac, when I want to relax and work on music I want to physically separate myself from the thing I do my work on. I embraced touch in the first place just for that, but now I'm infinitely more productive for music on my iPad than I am on desktop. I can reach a state of flow that's just not there when I use Logic on my Mac.
Ah, I was going to join in with my own thoughts but this conversation seems to have gotten a little... personal...
For the record, I find touchscreen interfaces rather underwhelming. I've tried to love my iPad for creative purposes but I just find trackpad/mouse + keyboard faster and more versatile.
Of course - both pale in comparison to physical buttons. I dream of a touchscreen with really convincing haptics. That would be the best of both worlds.
It should go without saying, but I’m not talking about absolute workflows because there are no such things.
Workflows are subjective and contextual. I too use a Mac for a lot of audio work, and I’m not suggesting that one replaces the other in all forms.
I’m simply saying why this is a big deal for that one part of my workflow that had a hole which a Mac doesn’t cover.
The whole “touch vs not” discussion doesn’t add much because it’s not a binary. It’s only a subjective preference for me in my workflows in a specific context.
I’m simply elaborating on why I prefer an iPad in this context. Rebuttals are pointless as a result because it’s not a debate, it’s an elaboration of choice.
So again, I would say to people: “you do you” because my workflow doesn’t impose or force other people’s choices with their own trade offs.
I think there's a difference in conversation style here.
If somebody says "I think x" I've always felt it was entirely reasonable to respond with "I disagree because y" - that's not being argumentative, it's just having a conversation.
You're not the first person who seems to feel differently about whether or not it's impolite to respond this way. It just always surprises me when I hit this particular difference in communication preferences.
That’s fair, and perhaps an issue of tone being lost in text or cultural differences in speech styles.
Though part of the issue is that “I disagree because Y” isn’t really relevant here, because it’s an opinion with no negative effect on those who disagree (as opposed to, say, opinions on well-established science). So “I disagree because Y” is implicitly a rebuttal.
Instead, the subtly different “I see where you’re coming from, however for me Y” is not a rebuttal. It’s an acceptance that two views may exist simultaneously and be equally valid.
If you look at the responses of the person I responded to, they’re always in the form of “well, I need this, so it makes no sense.”
Note that they never acknowledge that others may not have the same needs or views. There’s an absolutism to all the interactions I’ve had with them.
The simple act of accepting another’s perspective changes the post from a negation to something that builds upon it.
Yeah. I probably would have found smoldesu a little brusque if they were replying to me. But on the other hand, I'm sure I'm a little too brusque by other people's standards.
Or 3D Touch! Aftertouch and velocity are both possible on a touchscreen; the tech just never lived long enough to get an earnest try. I'd be very curious to see how that works if someone had a POC up and running, doubly so with good haptics.
>Even as a paying Bitwig subscriber, I don't get it. My instruments are velocity and aftertouch sensitive
So what? You can get an idea down without either. It's not about making the final performance (though you can also program the DAW piano-roll this way just fine too).
>And then if you want to plug in hardware, you have to either settle for one thing and no charging or buy a dongle.
God forbid you have to buy a $20 dongle, or like, just use the device without one, and enjoy merely 10 hours of use instead of 15.
I bring up Bitwig because it was originally made as a touch-enabled DAW. I daily-drive it on a laptop with a touchscreen, and used to even have it configured to run on my Surface Pro. Despite all this, the touchscreen was never really intended as a performance surface. If you look at how Bitwig (or even the new Logic) is designed, this is obvious: the portion of the screen reserved for performance is minuscule. Touchscreen devices are outclassed in expression by cheap XY pads with aftertouch sensitivity. Everyone, Apple included, knows people are not going to use this as a touchscreen MPC. Akai's touchscreen MPC does not rely on a touchscreen for expression.
> God forbid you have to buy a $20 dongle
God forbid the iPad ever exposes a second USB port worth of IO bandwidth.
Actually, I take it back. Knowing Apple, I'll get my wish fulfilled with 2 Lightning ports and some MFi cable DRM...
There seems to be a divide between people who find the extra level of abstraction of a keyboard/mouse to be a problem and those who don't. I don't understand it either; when I see people try to flex with a room full of synthesizers, I just roll my eyes and pull up my favorite VST.
FWIW, the iPad is velocity sensitive. I haven't tried Logic Pro on it yet, but with GarageBand, it definitely plays louder if you hit the keys harder (for patches that support it). Some software synths in GarageBand also have a pseudo-aftertouch where you slide your finger up after hitting the key to invoke aftertouch.
That said, I don't like playing on a screen. I'm a long-time piano and keyboard player so I just hook up an external instrument. But I was pleasantly surprised by what I could do just on the iPad with GarageBand.
I wonder if this works by measuring the size of your touch (i.e., how much your finger "splats" when it touches) or using the accelerometers/gyroscope or what.
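If it's the contact-patch theory, one way a third-party app could approximate it is via UITouch.majorRadius, which reports the estimated size of the touch. A minimal, speculative sketch, with made-up radius thresholds and a made-up view/callback, not anything measured from GarageBand:

```swift
import UIKit

// Speculative sketch: derive a MIDI-style velocity (1-127) from how large the
// finger's contact patch is when it first lands.
final class VelocityPadView: UIView {

    var onNote: ((_ velocity: Int) -> Void)?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // majorRadius is the estimated radius of the contact area, in points.
        // A glancing fingertip tap tends to report a smaller radius than a
        // flat, firm press, so map it linearly onto 1...127.
        let minRadius: CGFloat = 5     // assumed "soft" touch
        let maxRadius: CGFloat = 40    // assumed "hard" touch
        let clamped = min(max(touch.majorRadius, minRadius), maxRadius)
        let normalized = (clamped - minRadius) / (maxRadius - minRadius)
        onNote?(Int(1 + normalized * 126))
    }
}
```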
Have the UX such that you can have multiple screens that you swipe between... so you can add screens for DRUM MACHINE 1 through N and just swipe between them... and swipe up or down for media access and other screens.
It would be cool to have both apps (Final Cut for video and Logic for tracks) running at the same time with "integration" such that you can swipe between video and sound editing on the same final output...
I would assume more people using Final Cut Pro will use a keyboard case, but for mobile music production there may be more who are willing to go without one, particularly if you're mostly using it to noodle out ideas, play the on-screen instruments, etc.
And pedals! I've half-seriously considered experimenting with an expression pedal as a vertical scroll controller for desktop applications, using a second pedal to switch between relative (pedal position relative to center → scroll velocity, zero movement when pedal position is centered) and absolute (pedal position → page position, zero movement unless pedal position changes) modes.
For wide multicolumn IDE/text-editor layouts, spatially-mapped fader banks and/or organ drawbars might also make useful scrollers.
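If anyone wants to tinker with the pedal idea, the core of it is just a mapping from a 0-127 expression value to a scroll delta. A toy sketch of the two modes, assuming the pedal arrives as a MIDI CC value; the type names, page height, and max speed are invented, and there's no actual MIDI or scrolling plumbing here:

```swift
// Toy sketch of the two pedal-scrolling modes described above.
enum PedalScrollMode {
    case relative   // offset from center -> scroll velocity
    case absolute   // pedal travel -> position within the page
}

struct PedalScroller {
    var mode: PedalScrollMode = .relative
    var pageHeight: Double = 10_000      // assumed document height in pixels
    private var lastValue = 64           // pedal resting at center

    /// How far to scroll (in pixels) for one incoming pedal value.
    mutating func scrollDelta(forPedalValue value: Int,
                              frameSeconds: Double = 1.0 / 60.0) -> Double {
        defer { lastValue = value }
        switch mode {
        case .relative:
            // Deflection from the 64 "center" sets velocity; centered pedal = no movement.
            let offset = Double(value - 64) / 64.0    // -1 ... +1
            let maxSpeed = 3_000.0                    // assumed px/s at full deflection
            return offset * maxSpeed * frameSeconds
        case .absolute:
            // Pedal travel maps directly onto page position; no change, no movement.
            return Double(value - lastValue) / 127.0 * pageHeight
        }
    }
}
```

The second pedal would then just toggle `mode` between the two cases.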
The 11-inch iPad (Pro) is much more compact, imo. You can also just mount it flat against some hardware in a rack, or against a wall in a small studio booth.
Personally, I feel like the iPad is very "flow" and the computer is very "precise." When I'm in the zone, just vibing, I'd much prefer to bust out musical ideas on an iPad (assuming no instruments available, of course). When I want to tweak the exact reverb parameters I'd rather have the computer.
MacBook?