Xcode has a Touch Bar simulator and it actually works system-wide-- so if you're developing for the thing and need to get a feel for how Apple's using it, you can.
You do have to update macOS, though-- it's not included in the 10.12.1 build released a few days ago. The build you'll need is linked on Apple's developer site: https://developer.apple.com/macos/touch-bar/
I'm having a little trouble figuring out if this comment was sarcastic or not. Regardless, I think it illustrates pretty clearly why a lot of us are really uninterested in the TouchBar.
An incoming call isn't an alert. It's a highly time sensitive notification.
An alert is something that blocks further input until it is dismissed. Something along the lines of "Are you sure you want to disable encryption?" or "This item cannot be deleted because it is locked."
I think that it was providing shortcuts for interacting with an alert already displayed on the main screen. Apple's worry, I'm assuming, is that alerts will be displayed only on the bar and people without the new Pro will not notice them.
When the OS hasn't initialized Touch Bar, what happens to the Touch Bar? Is there a default Touch Bar loaded in firmware in cases of Linux or Windows in Boot Camp?
Good luck with that. The Boot Camp drivers have been all but abandoned. Maybe they'll do a single update soon that will be the only update for the next 5 years.
Source: disgruntled Early 2013 MacBook Pro user who can't use the built-in microphone from within Boot Camp.
It actually appears that the 'T1' processor, running a watchOS variant, in the new MBP might be controlling the Touch Bar so this might very well be possible.
This, along with the ditching of MagSafe, means I will not be purchasing MBPs anymore. I know that nobody else has magnetic charging now. Hopefully someone does before I have to upgrade.
But terrible chargers they are. I had to replace mine twice: once for my Pro and once for my Pro 3. The cable bends very unnaturally and is gradually torn from the magnetic connector. Other chargers have a stiffer piece of plastic (a strain relief) where the cable enters the connector, which lets it bend gently and avoids this.
I prefer it simply because it's an industry standard, and it means we don't get idiotic lawsuits like the one against the company that made external batteries for Apple laptops by buying Apple chargers and cutting off the cords.
USB Type-C is a standard and far more capable than Types A or B. The ecosystem is still maturing, and I hope this drives it forward, just as Apple's switch to Type A on the iMac did 15 or so years ago.
There's something to be said for not trying to put everything into a single expensive physical connection standard that requires complex active cabling.
USB was already a lot of software (and hardware) complexity just to attach a low speed peripheral like a mouse or a keyboard.
Switching power delivery over to USB-C has meant the elimination of MagSafe connectors.
Replacing fixed solutions like Ethernet is a step back to AAUI/AUI and MAU dongles, except now you have to put the entire Ethernet controller and a USB or Thunderbolt controller in an external dongle.
Too bad there isn't a way to have your app control the touch bar even when it isn't in focus. I would have liked to write a background application that overrides it for personal use. I already have one that does that for the caps lock key.
I bet there could be some accessibility-related use for the touch bar. Also, accessibility is already tied to completely unrelated things like event taps.
I couldn't find an API for it, but some Apple apps do in fact do this. I've seen three examples of it:
- Media playback: When a song or a video is playing in a background app, there is an extra button in the system button area on the right that lets you access a media control strip. From it you can pause the current video/audio and even scrub through it.
- Xcode debugging: While Xcode is attached to a running process, there's a button in the system button area on the right side of the touch bar that lets you access a debugging control strip, which lets you pause execution and step through the program. This access button is actually in the same place as the media control button, and in cases where both would be shown, the media control button wins.
- QuickTime screen recording: When you start recording the screen from QuickTime, the current recording time and a stop button are displayed in a touch bar overlay even if you focus a different app. However, once you do switch to a different app, you can close the overlay, and it minimizes into that same button slot on the right of the touch bar.
It surely would be nice to be able to do these things without private APIs, but I couldn't find anything so far.
Xcode 8.1 does that (visible in the TouchBar simulator). They add a global key to control the debugger (Pause/Resume program execution, toggle breakpoints…) while you're in your app, and away from Xcode: https://cl.ly/0v0z3b2S3Q0a
It's a very tiny space but I wouldn't be surprised to see someone try sticking ads there.
However, it sounds like in the absence of private framework usage, the only way for your app to show anything in the bar at all is for it to be in the foreground.
So it would seem like less of an issue than even showing ads on the main screen, where any app can technically create windows that float over all the others and can't be moved by the user.
"""There is no need, and no API, for your app to know whether or not there is a Touch Bar available. Whether your app is running on a machine that supports the Touch Bar or not, your app’s onscreen user interface (UI) appears and behaves the same way.
The Touch Bar dims automatically and wakes when the user touches it. Do not show alerts in the Touch Bar, and do not use the Touch Bar for widgets."""
Nothing stopping the developer from showing alerts or ads though, right? Your app might not know if it's enabled, but then again it can just run 100% of the time in case it is.
Unless the OS is rendering the bar off-screen even when there isn't a hardware bar, you could trivially determine whether there is a bar by having the code that controls the bar set a flag / send a message.
It sounds to me like it works through an interface. You call the function `doFoo()`, and if there's a foobar, the OS does foo. If there isn't a foobar, the method does nothing (but both return STATUS_FOO_SUCCESS).
For the touch bar, you aren't doing the actual rendering; the OS is. Only the OS knows if there's a touch bar, so without a kext or something, it seems there's no way for you to know if the touch bar exists.
Rendering and uploading something (or sending some messages to the touch bar controller; or queuing messages to the touch bar) probably takes a tiny bit longer than
if(!touchbar) return MAKE_SUCCESS_GREAT_AGAIN;
But then again maybe the user space component just doesn't have that conditional.
So who's gonna be working on Sublime/Atom/Vim/Emacs integration for this? Shit, imagine all the things one can do with that bar (multi-touch, slidable, swipeable).
What can I do with the bar that I can't do with keyboard commands and my multi-touch touchpad? Vim seems especially rough, with no physical escape key. Guess I'll map that to caps lock?
Increase/decrease font size, adjust window height/width. I know you can do it with keyboard, but perhaps touch bar could make it more "natural" and free up key bindings for something else. Escape can be mapped to a double width button on the left.
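For what it's worth, Sierra added a built-in way to do the Caps Lock → Escape remap via `hidutil` (documented in Apple's Technical Note TN2450), so no third-party software is needed. A sketch, assuming macOS 10.12+; note the mapping resets on reboot unless you reapply it (e.g. from a LaunchAgent):

```shell
# Remap Caps Lock (HID usage 0x700000039) to Escape (0x700000029).
hidutil property --set '{"UserKeyMapping":
    [{"HIDKeyboardModifierMappingSrc": 0x700000039,
      "HIDKeyboardModifierMappingDst": 0x700000029}]
}'
```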
How is this better? I don't have to look at the keyboard to perform chords. With no tactile feedback, I certainly have to look at the touch bar.
What's more, common actions usually have high-priority, simple chords (cmd-c/cmd-v, for example) --- whereas complex chords are used for infrequent commands (cmd-alt-shift-c to re-assign the origin in blender, for example). If we're going to surface commands in the touch bar, intuition says to surface the common commands, and leave infrequent ones tucked away, meaning that even with the touch bar you don't have quick or simple access to those commands.
This seems like a developer-centric viewpoint. Users of my app are not tech savvy and this could be a very convenient way of providing shortcuts to things. We already offer customizable keyboard shortcuts but nobody uses them.
I think it's a very "I can type without looking at my keyboard" centric view. If your keyboard shortcuts are already going unused it may be a discoverability problem, or it may just be that people who aren't power users will always prefer the mouse/track pad. If that's the case then I'm not sure glitzy icons on the keyboard help them.
This is also Apple-specific hardware, which kind of flies in the face of becoming a "standard".
I hope it doesn't get adopted by other laptop manufacturers. Imagine it does. Then we'll need desktop keyboards to support this standard. I like my wireless keyboard's battery life, and I don't see any benefit in buying a more expensive keyboard with decreased battery life in order to support UI I don't intend to use.
There can still be a JS API for it that's not standardised. See Apple's JS APIs for Force/3D Touch in Safari on Mac and iOS. Unfortunately though they both have different APIs :(
In saying that though, I doubt Apple would actually make this available for websites.
They showed Safari using the bar for various general browser functionality, e.g. back button and tab switcher, so that would require adding a toggle to switch from that UI to the one the webpage has defined.
I think Apple is going to use this to differentiate its native app offerings from competing web apps, particularly Google's web apps, so I doubt they put much effort into that.
It's not likely that Apple will really start to embrace the web, but we can go the other direction! Somebody could figure out how to tie this into an Electron API. (It would be awesome to use the touch bar with Atom, for example.)
With a bit of legwork, you could refactor and ship a React app with react-native-macos, and write a module that allows you to use NSTouchBar from JS.
If someone can figure out how to trick legacy MacBooks into displaying this bar on the screen of a plugged in iPhone/iPad then you can take my money now.
There are so many old iPhones and iPads out there just waiting to be repurposed for something like this.
Heck, I'd even be interested in a reasonably priced (< $150) external touch strip as an accessory for my current MBP.
> Because the Touch Bar is designed to work with AppKit, it is fully accessible.
> Be sure to use the customizationLabel property on every NSTouchBarItem instance that you designate as customizable (as described in NSTouchBar Customization). The accessibility system in macOS makes use of these labels.
I can't see how they would even begin to make that possible. For the next couple of years, the majority of MacBook users will not have a Touch Bar. This means developers will only use it to duplicate functionality that is already accessible in applications. You will not see features that are exclusively found on the Touch Bar.
It is encouraging to read this in Apple's documentation[1]:
"There is no need, and no API, for your app to know whether or not there is a Touch Bar available. Whether your app is running on a machine that supports the Touch Bar or not, your app's onscreen user interface (UI) appears and behaves the same way."
Skip forward 5 years to a time when software developers assume that every Mac user has a Touch Bar. We will get to see just how poorly they make use of the feature. Even as someone without any disabilities, I sincerely hope I am never absolutely required to use the Touch Bar for features that cannot be found elsewhere in an application.
TouchBar will play nice with digital audio workstations like Logic Pro / Live, etc. I'll be happy to record pitch/mod/any VST param automations on a touch strip.
DJing apps could show cue points and let you trigger them.
While fullscreen mode on macOS encourages distraction-free focus, the Touch Bar takes a step backwards.
Actually very pleasantly surprised that this version 1 API has this many publicly customizable features. My guess was that this would be locked down in the first new-gen MacBooks.
I want a device API for third-party keyboards, and a split toolbar mode for split keyboards, so that third-party ergonomic keyboards can be developed building on this.
Most likely Apple is not going to rewrite everything in Swift, as that usually leads to more bugs due to missing documentation, but they are pretty clear it is a replacement for C and Objective-C for new code.
"Swift is a successor to both the C and Objective-C languages."
Hopefully the next Xcode build will include a Touch Bar simulator so there's no need for me to get a new MBP in order to work on this.