Occasionally I'll write an app for my kids or wife. Every time I'm thoroughly impressed by the Apple development ecosystem and thoroughly disgusted by Google's for Android.
This is no different. The Android development process is painful (the most verbose, cruft- and boilerplate-filled Java), cumbersome to organize and build (Gradle is terrible and buggy), and cumbersome to debug (the integration with Studio is just clunky). About the only thing Google does better is testing releases through the developer console.
It's nice to see them finally providing something similar to ARKit. I just wish they'd work on all the other things that make Android development a horrible experience.
There are a lot of complaints about Java being verbose and bad; does Kotlin fix these issues? Does it make it fun again? I personally used to dislike programming until I started using Python at work, and I fell in love again. Syntax, libraries, and API design can definitely have a huge impact on how much you enjoy what you do.
As for your comment about Activity/Fragments, they've had a few I/O talks going over their thoughts:
I would argue Kotlin is a lot more fun than Java. As for the Fragment lifecycle, it still sucks, but no one forces you to use it. Lots of apps never even use one fragment. Google simply made fragments hoping everyone would make their UIs into reusable components which work better on tablets.
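For a rough flavor of why people say that (just a sketch; the class and data are made up): the value object plus a filtered list that takes a page of Java (fields, constructor, getters, equals/hashCode) is a few lines of Kotlin.

    // A data class gets the constructor, getters, equals/hashCode and toString for free.
    data class User(val name: String, val age: Int)

    fun adults(users: List<User>): List<User> =
        users.filter { it.age >= 18 }

    fun main() {
        val users = listOf(User("Ada", 36), User("Kid", 9))
        println(adults(users))  // [User(name=Ada, age=36)]
    }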
The problem is that the benefit you note requires a lot of extra effort and at least one additional layer of abstraction, which more than doubles the boilerplate/cruft required to maintain proper UI flow through an application.
Their "solution" is more of a problem than the problem it was meant to solve.
I don't think Java 8 is that bad, it's just mediocre. It's fast and has a good ecosystem.
Yeah, Python is fun, it seems effortless to do stuff there compared to Java. For example, "a = [1]" vs "List<Integer> a = new ArrayList<>(); a.add(1)" - and then you have to compile and run, no REPL to try it out.
Recently I've been using Julia; it's maybe the most fun I've had with a language. Some things are much easier to do than in Python, and it's 1 to 2 orders of magnitude faster.
Since you mentioned Julia - are you using it for math or math-related programming, or doing something else? It looks interesting to me, but I'm a web dev (Python on the backend) and don't have much use for most of the math stuff, so I don't know if it's suitable.
* Swift is one of the best-designed languages out there.
One feature I dislike about Swift is the Optional type and the "?!" business, borrowed from Scala, I assume. It makes the coding experience much less pleasant while providing little value-add. What is wrong with good old nil?
I find creating UI in Xcode, with Auto Layout and IB, far better than using Android's tools. Mainly because IB actually shows me what I'm working on and what it's going to look like. The Preview window in Android Studio routinely errors out, telling me it can't preview my layout because of some error, often caused by something from Google's own compatibility library!
Is refactoring in Xcode using Swift still a horrible joke? It just seems odd that you would say Android Studio is not that much better than Xcode considering the horrible inadequacies of Xcode when it comes to Swift. Meanwhile, pretty much anything you can do with Android Studio, with respect to Java, you can also do in Kotlin.
> - Probably need an iOS device, as the emulator is very slow
Only if you're doing OpenGL, as that's actually rendered in software (not sure about Metal). Otherwise, the simulator (not emulator!) is fast, as it should be, since it's running native code (which is why it's a simulator, not an emulator). If anything, it's important to test on devices because the simulator can mask performance problems (though as devices become more and more powerful that becomes less of an issue).
I'm sure you're correct, but that seems backwards. OpenGL is a cross platform API, why would that be emulated in software? And the iOS devices have a different CPU architecture (ARM) than the MacBook Pro (x64) you develop on, so how is that being run as native code?
I can only speculate as to why OpenGL is rendered in software, I assume it has to do with the fact that the graphics capabilities are different on iOS versus Macs, and perhaps software rendering is needed to ensure consistent behavior or implementation of OpenGL extensions (though I'm not positive that the simulator offers the same OpenGL extensions anyway; I'm not a graphics programmer so I haven't really explored this).
As for it being a simulator, it's because your app actually compiles to x86_64 code when you're targeting the simulator. When you switch between targeting the simulator and targeting a device, your app is recompiled for the new architecture. And the simulator includes a complete copy of iOS and all its frameworks (but without most of the built-in apps) that were compiled for x86_64 in order to run in the simulator.
OpenGL is not being emulated, it's being implemented. OpenGL is just the API and specifications, it's up to individual graphics hardware vendors to put a conforming OpenGL implementation in their driver. Ideally it would all be done very fast in hardware, but there are still times when a particular feature can't be done on certain hardware, and is performed in software to conform to the OpenGL spec.
Apple already has a complete software OpenGL implementation, which they may have modified to simulate the individual OpenGL ES implementations for each iOS device. This also has the advantage of removing the developer's hardware from the equation: If they want to test a bleeding-edge OpenGL ES app on a really old MacBook, it'll run - just slowly.
Anyone who uses the excuse that there are "countless screen sizes to worry about" makes me wonder whether they even know what they're talking about when it comes to Android development. The developer of Pocket Casts says this is a fallacy:
To whatever degree it was true, it was a lot MORE true when you only had to worry about the original iPhone screen. Right now it's the regular size, or the Plus, or maybe the 5.
It's no longer just one or two sizes on iOS. Probably a minimum of three, assuming you don't want to do a tablet app, and that may change in two weeks.
It provides several different flavors of the C++ STL, including libc++. `libc++` is specifically the implementation shipped by LLVM/Clang (which is also what Apple uses).
The next release of the NDK (r16; due out sometime this quarter) will stabilize libc++ for use by NDK applications and it'll be made the default STL in r17 (which will be out by the end of the year).
The ARCore announcement says "ARCore works with Java/OpenGL, Unity and Unreal and focuses on three things:", which shows what the position of the NDK is.
Nothing more than a way to implement Java native methods, which I am completely OK with, given the security implications.
I just wish that, since they have their own fork of the Java world, they would also bother to provide something other than forcing us to manually write JNI calls.
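For anyone who hasn't done it, the manual plumbing looks roughly like this (a sketch; the class, package, and library names are made up), plus a hand-written C/C++ function on the other side whose name has to match exactly:

    package com.example

    class NativeBridge {
        companion object {
            init {
                // loads libnative-lib.so produced by the NDK toolchain
                System.loadLibrary("native-lib")
            }
        }

        // matched on the C/C++ side by a function named
        // Java_com_example_NativeBridge_stringFromJni, written by hand
        external fun stringFromJni(): String
    }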
I think you are taking it the wrong way, and you are not at fault: there is a lot of FUD about screen sizes.
Designing for flexible screen sizes and densities is pretty easy and I don't think I would gain any significant amount of time if Android was limited to 10 screen sizes.
You just think in terms of density-independent pixels and inflection points where you adapt your design (one more row, multiple panes, etc.).
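As a rough sketch of what an inflection point looks like in code (the layout names are made up, and this would sit in an Activity's onCreate; the more common route is resource qualifiers like layout-sw600dp, but the idea is the same):

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // decide from available width in dp, not from a list of specific devices
        val widthDp = resources.configuration.screenWidthDp
        val layout = if (widthDp >= 600) R.layout.main_two_pane else R.layout.main_single_pane
        setContentView(layout)
    }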
Which really you're supposed to be doing on both OSes at this point. As an iPhone user, there are still apps that don't support the iPhone 6 screen size, and it's obnoxious.
iOS has four different Retina screen resolutions on the phones, more on the tablets. For all we know there will be more in two weeks. Creating pixel-perfect layouts doesn't work very well anymore.
It may be easier to exhaustively test on iOS because there aren't quite as many variations, but devs should definitely be using flexible layouts.
And then of course there's accessibility. If your app is already designed to handle different screen sizes, then it's easier to resize various elements when the user wants bigger or smaller text.
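On the Android side that mostly falls out of sizing text in 'sp' (scaled pixels), which tracks the user's font-size setting; a minimal sketch, with a hypothetical view:

    import android.util.TypedValue
    import android.widget.TextView

    fun styleTitle(title: TextView) {
        // 20sp grows or shrinks with the system "Font size" accessibility preference
        title.setTextSize(TypedValue.COMPLEX_UNIT_SP, 20f)
    }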
> - Needs a quadcore Xeon with at least 16 GB and SSD to have an usable experience with Android Studio, or configure it to run in laptop mode
I'd disagree with the Xeon bit; I have a 6-year-old Sandy Bridge quad core, and Android Studio runs butter smooth.
I'll confess to the 16GB of RAM and an SSD though. Although honestly an SSD nowadays is required for anything to be usable.
Android Studio is amazingly performant though, and the emulator is great, ignoring bugs, glitches, and the occasional times it just stops working until I flip enough settings back and forth that it starts working again.
Of course a huge benefit is that I don't need Apple hardware to develop for Android.
> I also rather develop for Android, but Android Studio resource requirements made me appreciate Eclipse again.
There is a reason my dev machine is a desktop: better keyboard, better monitor, better performance. It's a 6-year-old machine that cost about $1500 and performs better than the ultraportables a lot of people try to press into service for writing code. Even with a faster CPU, thermal throttling is a concern once the form factor gets to a certain size.
Ah, interesting. When my team used external consultants, we did the inverse: we gave the consulting company a beefy requirements list and told them anyone sent to work for us had to be at least that well equipped.
Paying by the hour, we were heavily motivated to minimize compile times. :)
>Needs a quadcore Xeon with at least 16 GB and SSD to have an usable experience with Android Studio, or configure it to run in laptop mode
I've had no problems using Android Studio on my Mac with 8GB. On a side note, the Android emulator even started faster than the iOS simulator. I also found it odd that the Android emulator seemed to consume fewer resources than the iOS simulator, which was taking up about 2GB of RAM.
Thanks, it's a one-time payment so I completely forgot about Android's developer account fee. And I'll add the other points too.
Edit: Also, how does Xcode compare in performance? It seems lighter, but I only have a pretty recent MacBook Pro to test on (which also handles Android Studio just fine)
> Also, how does Xcode compare in performance? It seems lighter, but I only have a pretty recent MacBook Pro to test on (which also handles Android Studio just fine)
Depends on what you open with it - for ObjC it's usually faster and smoother, for Swift it tends to be slower (about on par with AS Kotlin plugin) and for C++/ObjC++ it's horribly slow just like any other IDE out there :/
I read that as AR development having more in common with game development, which makes sense given the emphasis on performance, 3D rendering, and latency.
Yeah, I should have said it could benefit from using a fully-fledged game engine, considering you'll likely need to import 3D models, have objects interact with each other, and so on, all of which Unity would help a lot with.
However, I primarily do experimental work in those engines, so I'm pretty biased.
VR also has a ton of non-game use cases. This is one of the biggest issues with VR marketing. But yes, as other commenters have mentioned, AR development still benefits from using a game engine (maybe rebranding it as a "3D application framework" would make it more palatable to "serious app developers") because you probably want to work with 3D models/rendering.
Xcode is the IDE most different from other IDEs. I like Swift but just don't like Objective-C. The build tools and ecosystem are too tightly tied to the platform (I like to switch between development machines without always having to be on a Mac).
Java is definitely painful, but I suppose the bias I have here is that I have developed on it for several years.
The breath of fresh air so far has been React Native, and I wish more things would get ported over to JS (or something like the Expo kit).
:-) - Strangely, I am looking forward to this. I think it just comes down to the fact that I am very comfortable with JS and node ecosystem and prefer that over all other mobile platforms atm. I also think maintaining one (almost ~80%) codebase for both platforms is a significant advantage.
Xcode is best for things like managing certs, not for actually working in. The IntelliJ version is pretty good; it integrates well with the build system and debugger.
I find it curious you bring up Expo, as I've found it to be the most opaque and user-unfriendly IDE I've used in some time; I don't get why they don't just leverage VSCode and quality tooling over Yet Another Goddamn IDE.
Curiously, I feel the opposite way: Xcode's support for managing certificates is pretty awful (it's gotten better recently, but it still occasionally gives cryptic errors for no discernible reason). For actual programming, though, it works pretty well as long as it doesn't crash.
To clarify: Expo as a framework, and not XDE. I think they have made ejecting from Expo a bit cumbersome, but it works with some wrangling. I like the ExpoKit framework in general but don't want to be tied to Expo's release chain.
If you mean XDE, it's not meant to replace VSCode or Sublime or anything. XDE just gives you buttons for common actions that you would normally do via the CLI.
Android Studio/IntelliJ may be excellent to you, and I'd agree the IDE is better than Xcode as far as slinging code into an editor is concerned. However, the Android development system isn't just the IDE: it's the build process, integration with debugging tools, and the APIs. Java doesn't have to be a bloated mess of over-engineered cruft. I've seen wonderful, clean Java. Android development is the opposite of that.
Kotlin may be great. I haven't used it much, but so far I'm finding it can't hide Android's bloated, over-engineered substructure.
The build process is significantly better in the Android world once you actually take a look at how it works and need to start doing automated testing and deployment. Xcode's tools for actual continuous development beyond manual clicking are horrendous and waste a huge number of person-hours.
There's a third-party solution for iOS build automation: https://fastlane.tools. I haven't tried the whole end-to-end solution, but the pieces I tried worked very well.
Unfortunately, the support for managing code-signing in Fastlane isn't quite fully-baked, and it can get messy. It's still really useful, though, if you have to deal with iOS builds.
We used that, but sadly it doesn't help with CI, constant simulator / connected-device issues, and breakages in toolchains when Apple releases a new Xcode :(
Instant Run was buggy and, ironically, made builds a lot slower the last time I tried it. That was after they said they'd fixed a bunch of issues and we should give it another chance.
I've been doing Android since 1.0 and started with iOS/Swift half a year ago. I think the iOS platform is nicer, simpler, more thought out, and overall a better experience, but not by a huge margin. Haven't used Kotlin yet though.
One area where iOS sucks is creating and (sometimes) working with UI.
I disagree; I think it made the response be entirely condescending and removed any point they had.
For what it's worth, I have about as much experience as they do, and I disagree with them. While the tools have improved, they still have a long way to go.
And? Without fail, whenever there is a post about Android on HN, there'll be someone who miserably recounts their tourist experience with Android. But that has zero professional relevance to anyone doing anything with any intention or focus at all. Actually, it's worse than zero relevance; it's simply misleading noise.
Somehow many of us manage quite fine, and enjoy the experience.
In every other realm that sort of drive-by shooting gets rightly criticized. "Tried Rust -- => everywhere. Lame". "Tried vi. Couldn't quit. Garbage."
Sure, if the only tangible thing that the parent poster said wasn't that he's been on the platform for 9 years. The brain's an amazing thing - it can get used to almost anything, given enough time!
Congrats on making it 9 years. I've only been doing mobile development for 2 years, and using React Native for most of it. So I have an opinion that may be less biased toward a certain toolset.
I'd compare developing on Android very unfavorably to iOS. All the points the OP made are accurate, in my experience. Every time I needed to dive into Android native code, layout inflation, etc., I found it to be a crufty and unpleasant system. And Gradle is really a pain to use: tons of edge cases and unhelpful error messages. Add to that, you need to support like ten thousand devices, many of which are running Android 4.4 (which, IIRC, is like three years old) and have a WIDE range of screen sizes.
Compare that to iOS development, and the differences seem obvious and apparent to me.
> Add to that, you need to support like ten thousand devices, many of which are running Android 4.4 (which, IIRC, is like three years old) and have a WIDE range of screen sizes.
I've been developing Android apps for years (before the support lib even existed, much less all the fancy new kids) and the range of screen sizes has literally never once been an actual issue. Use 'dp' instead of 'px' and you pretty much never have a screen-diversity problem. All the platform views & layouts handle the majority of the work.
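To make the 'dp' point concrete, a minimal sketch (the extension function and the view in the usage comment are hypothetical):

    import android.content.Context
    import android.util.TypedValue
    import kotlin.math.roundToInt

    // Converts a dp value into pixels for the current screen density,
    // so 48dp is the same physical size on every device.
    fun Context.dpToPx(dp: Float): Int =
        TypedValue.applyDimension(
            TypedValue.COMPLEX_UNIT_DIP, dp, resources.displayMetrics
        ).roundToInt()

    // usage: button.minimumWidth = context.dpToPx(48f)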
Does React Native make this more complicated for some reason?
Device support is a nightmare from a game development perspective if you're coding in C++ and using the GPU directly. Figuring out why the black box shader compiler is crashing on a particular device is not fun.
Perhaps screen sizes were a bad example of fragmentation (though, because the range is so much wider than on iOS, keyboard-avoiding stuff can be unpleasant). Better examples would be subtle device-specific bugs. Like, off the top of my head, touch events being handled differently for views on the Galaxy S5, making side swiping impossible. I can't confirm if that's RN-specific or not, but I do know that I spent tens of hours on Android-specific bugs that just were not present on iOS.
And perhaps RN was the culprit for some of these things. But definitely not all.
Same experience here. The API is the same, so what exactly is the problem? There's an occasional device-specific bug, but I usually don't spend much time on those.
Why is it that anytime something comes out from Google (or just about anyone else, for that matter), people have to find something to complain about? There are more engineers at Google than the ones that were working on this project. Just because they are developing this project doesn't mean they stopped development on all other Android-related projects.
So why don't we actually discuss the product instead of finding a way to shoehorn in an unrelated topic to complain about?
The problem isn't that someone mentions something else, the problem (from the standpoint of discussion of this story) is that comments like this resonate enough with people to get upvotes. That's actually Google's problem.
Even if you think it resonates because people are just drinking the Apple kool-aid, that's still Google's problem.
Let Google handle its own problems, and use whatever vote power you have to help steer the conversation here. Complaining about it doesn't exactly improve the signal-to-noise ratio.
Android and iOS compete in the same ecosystem. It's extremely relevant, in my opinion, to compare and contrast their offerings. The release of a new product, library, or feature is an excellent time especially because of the freshness and potential additional relevance.
And I was talking about the product. ARCore is a single example of the larger, broader problems I "complained" about.
This is the view from the inside as a Google employee. You see all these lines between what you work on and what other people work on. From the outside? People see it quite differently. For an example you can maybe relate to, it's easy to criticize Apple for recent design decisions (the Touch Bar, lack of an Escape key, etc.) and say that the whole company is losing touch with reality. The people working on the iPhone might feel the same way you do here: there are more engineers at Apple than the ones working on the Touch Bar.
I don't even care for Apple in general. If I were ever to develop these for commercial purposes, however, I'd go Apple all the way and consider Android as a secondary market. This reflects the quality of the development environments and the customer base (in terms of revenue). The only reason I write anything for Android is out of necessity (these are the devices we own).
Funny, because then you'd meet the amateur hour that is Xcode automation and continuous integration. As soon as you graduate from a toy app to something you need to maintain for a while, you find out just what a horrible state the Apple development tools are in when it comes to UI testing, building in general, and constant breakages of code when a new iOS is released.
We literally spent a month of man-hours every year fixing up the constantly breaking Xcode CI setup and iOS API breakages while the Android team continued development unimpeded. Not to mention the constant breaking changes and crashes of the Swift language and IDE toolchain.
I do agree that for a complete beginner the experience is significantly better in Apple world, but please do not comment on "commercial purposes" development if you haven't done either.
> and constant breakages of code when a new iOS is released […] We literally spent a month of man-hours every year fixing up […] iOS API breakages
Unless you're referring to the Swift 2 -> 3 migration process, I have to seriously question what you're doing that causes so much breakage with iOS version updates. With Obj-C there's usually just a couple of deprecation warnings to handle. With Swift (outside of the 2 -> 3 migration process) there may be a few more updates, due to the Swift SDK wrappers, which may be hard errors instead of merely warnings, but it's still usually pretty easy to fix. And if you are talking about the Swift 2 -> 3 migration, good news: you don't have to do that again!
Which makes me wonder, when you say "iOS API breakages", do you really mean you're using SPI, method swizzling, or subview diving and have to deal with the fact that you're doing something against the rules?
You can download the Xcode and iOS betas and try it out right now, on current hardware. And you can trust that all your potential users will have it by the end of the year.
I'd mostly agree, but the one massive pain point I have with Apple (iOS) development is automatability. You seem to need manual steps in the Xcode GUI for everything, or at least if you don't, it's not well documented or generally known by the community how to set things up without manual UI steps.
Android's mess of Java cruft may be overkill, but it's at least well documented, text-file-configurable, and can 100% be maintained without ever installing Studio.