Tag Archives: iphone

Well, obviously.

Yet another nice photo by iPhone. You’re so convenient!

I kinda like that look, actually.

In all seriousness that pane of glass isn’t supposed to be smashed into a squillion little pieces, but it kinda gives it that frosted-glass look (without spoiling the overall UTas logo) which I kinda like.

A (mostly subjective) review of Essay, the rich text editor for iOS

Essay is fancier. Essay isn't perfect either, but first, here's what it does do. Firstly, it's a text editor. Off to a great start there. There's no accompanying web app for easy access to my writings away from the iPhone, but it does sync the HTML-formatted files to Dropbox, which is fine. For a writing app, it has some pretty advanced features: rich text, combined with pretty standard HTML stuff like lists (ordered, unordered), sections, paragraphs, and so on, as in the screenshot above. You can create hyperlinks to other files within Essay, or to actual web locations. Bold, italic, underline, strikethrough: all present, all easily accessible through the custom keyboard add-on (which works brilliantly, by the way. Fluid, simple, an all-round excellent implementation of something that could have been very convoluted indeed). The iPhone app is very new (just released today, in fact), so there's no word count that I can find, although I assume the iPad version has one. It even has a full-blown web browser in-app to let you follow links.

The number one issue I have with Essay is that it isn't fully native. Not that it's a web app or anything like that (it isn't), but the main editor view is basically a glorified HTML interpreter. Let me tell you what I mean. While the editor tries its very hardest to appear native (at least it includes the standard text-selection tools), it's unlike any other editable text view I've ever seen. This is evidenced by a couple of factors:

  1. The magnifying loupe that appears when you tap and hold to finely place the cursor makes text look pixellated. I've never seen that before, in any app. Examples: Essay, Simplenote. It's not just the iPhone version that does this; it looks just as bad on the iPad too.
  2. The cursor blinks at a different rate, in a slightly different way to any other cursor I’ve seen.
  3. Selecting text is sloooow. There’s a lag associated with everything.

Points one and three lead me to the conclusion that it must be some sort of emulated HTML interpreter, not the native iOS text view that I know and love. I’m guessing it’s this non-native text view that allows such advanced features as text styling, lists, highlighting, and so on, but it’s also this non-native text view that has some serious drawbacks:

  • Selecting part of the text and hitting “select all” frequently results in the standard “cut copy paste” popup not appearing. Selecting a few words works, however.
  • Tapping the status bar (the time) doesn’t take you to the top of the current scroll view, like it does pretty much everywhere else.
  • Points one and three above: sure, pixellation is just a cosmetic issue, but slow text selection is a functional one. The cursor-placing lag is bad enough to be genuinely terrible UX.

It’s these small things that make Essay not what I’m looking for at the moment. Sure, it’s fancy — but when fancy comes at the cost of serious drawbacks I’ll have to turn down that particular offer.

via iOS Reviews.

Written by yours truly, of course.

A slightly-overexposed photo of Domokun

Slightly over-exposed in the top right LEFT corner, but otherwise I think it turned out okay. I had a different one taken under fluorescent lights, but I like this one better; it brings out the texture of Domo’s fabric, whereas the one under fluoros was pretty bland.

As an aside, the image number is 1337. Yes, I had a little giggle over that one.

EDIT: I get my right and left confused without meaning to.

Hello, Melbourne.

See those ugly lines running all over the screen? Exactly what I hate about Melbourne. Sometimes a guy just wants his view of the sky to be unimpeded, you know?

Photo taken with iPhone 4 with in-camera HDR. Straight from the camera, no editing.

Dell SP2309W — 2048×1152 what now?

I spose the iphone4 would be a good subjective test of screen tech like this – Cramming relatively big res into tiny screens.

Er, no, no it wouldn’t.

Back story: there’s a pretty nice screen that Dell makes. It’s the SP2309W, and for $279 you get a 23″ TFT Dell monitor that does 2048×1152, higher than full high definition (but still at a ratio of 16:9).

I pointed out this monitor to a couple of my friends, and one made the comment you see above (along with something about a weird resolution for a computer monitor).

Before I continue I’d like to point out that most of this is a re-hash (albeit a pretty poor one) of Dustin Curtis’ thoughts on the issue — I’d suggest you go read his blog first, and then come back here when you’re done.

And that’s exactly where he’s wrong. It’s not like the iPhone 4, because while the iPhone 4 crams a relatively big res into a smallish screen, it does so in a way that doesn’t affect the size of on-screen elements.

Traditionally, what happens is that as pixel density gets higher, user interface elements get smaller. That’s because any specific UI element, and text as it has traditionally been rendered, is drawn at a fixed size in pixels, no matter how densely those pixels are packed.

Over at his blog, Dustin explains:

This means that if you draw the letter “a” in 12pt Helvetica on any screen, it will take up exactly 8×9 pixels (almost all the time). As you increase the number of pixels on the whole display, the number of pixels that it takes to draw the letter “a” in 12pt Helvetica stays the same, the letter just becomes smaller.

More pixels crammed into a smaller space (that is, a higher pixel density) results in things becoming smaller. If you think about it, it makes sense: say you’ve got an image that’s 512×512, the size of a typical Mac OS X application icon. If your screen displays that at, say, 100ppi, it’ll appear at a certain physical size if you chose to measure it with a ruler. Measure that same icon on a 130ppi screen, and it’ll appear smaller. Not because it’s lost any pixels, but because those same pixels have been jammed into a smaller space.
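To put actual numbers on that, here’s a quick back-of-the-envelope sketch. The 512px icon and the 100/130ppi figures are just the illustrative values from the paragraph above, not the specs of any particular screen:

```python
# Physical size of the same 512x512 icon at two different pixel densities.
# The pixel data never changes; only the density (pixels per inch) does.
icon_px = 512

for ppi in (100, 130):
    inches = icon_px / ppi
    print(f"{icon_px}px icon at {ppi}ppi -> {inches:.2f} inches across")

# 512px icon at 100ppi -> 5.12 inches across
# 512px icon at 130ppi -> 3.94 inches across
```

Same icon, same pixels, over an inch smaller on the denser screen.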

Then you hit the iPhone 4. It’s not quite resolution independence*, but what Apple have done works pretty well. Instead of using the same graphics resources as the iPhone 2G/3G/3GS, developers are encouraged to create “retina-optimised” graphics — that is, graphics at double the linear resolution of their previous-generation iPhone counterparts. Why? Because such graphics increase interface definition.

If you take that same icon from the above example, instead of just scaling it up or down to suit different resolutions, you can create a whole new version of that icon so that it displays at the same physical size regardless of which screen you display it on. Obviously the icon will look vastly improved on the higher-resolution display compared to the lower-resolution one, but that’s only because we’re increasing image density alongside pixel density.

Dustin, again, sums it up best:

This means that when iOS scales the elements in physical size to fit the 3.5-inch iPhone 4 screen, they take up the same amount of space as the elements drawn on the iPhone 3GS but they use four times the number of pixels.

Four times the number of pixels, represented in the same physical space = incredible user interface definition.
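The arithmetic behind that “four times” is easy to check against the published specs: the iPhone 3GS screen is 480×320 and the iPhone 4 screen is 960×640, both 3.5 inches diagonally. Double the pixels in each dimension means quadruple the pixels overall:

```python
# iPhone 3GS vs iPhone 4: same 3.5-inch screen, double the linear
# resolution in each dimension, so four times the pixels in the
# same physical space.
gs_w, gs_h = 480, 320   # iPhone 3GS screen resolution
r4_w, r4_h = 960, 640   # iPhone 4 (retina) screen resolution

pixel_ratio = (r4_w * r4_h) / (gs_w * gs_h)
print(pixel_ratio)  # 4.0
```

Hence retina-optimised “@2x” assets: twice as wide and twice as tall, filling exactly the same physical footprint.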

If that’s not mind-blowingly awesome, I’m not sure what is.

The whole “retina display” mentality of the iPhone is not about representing more things in the same space; it’s about showing the same stuff, just at a better quality. Contrast this with the Dell monitor above: because whatever you run on that display (Windows or Mac) isn’t resolution independent (Mac OS X is, to a degree), things will appear smaller, and that’s just how the cookie crumbles.

* okay, it’s not resolution independence at all. Without getting too technical, Apple are actually using two sets of graphics resources for everything — apparently they found that ahead-of-time resolution independence offered the greatest performance/resource benefit. More reading available here on the matter (thanks, Bjango!).