Angry Birds continues its explosive growth. Almost 21 months after its December 2009 launch, the bird-flinging game now boasts 350 million downloads and over 300 million minutes of gameplay each day. And its merchandising side is seeing similar gains. According to Andrew Stalbow, Rovio’s General Manager for North America, monthly sales of plush toys and T-shirts are now in the millions. Not resting on its laurels, the franchise is readying two new games for release by the end of the year and is working on new features like geolocation to spice things up.
Amazon Cloud Drive Photos, the photo-uploading utility that moves photos from a mobile device into Amazon’s online storage, may have to change its name. The tool no longer supports just photo uploads; it supports videos as well. Videos can be manually uploaded one by one, or users can opt to have videos auto-save from their devices directly into Amazon’s cloud.
This automatic upload option was already available for photos through an update released at the beginning of the year, but until now videos within Cloud Drive Photos had not been supported, whether manually or through the auto-upload feature within the application.
Amazon says that videos are restricted to 2 GB in size or 20 minutes in length, whether they’re being uploaded to or downloaded from the Cloud Drive service – that’s slightly longer than YouTube’s default limit ahead of account verification. This covers the majority of users’ personal videos: clips of activities, pets, kids or events, for example, recorded on their mobile devices.
After the files are in Amazon’s cloud, the videos can be played back on any device, including, of course, the Kindle Fire and other Android tablets. According to a post on Amazon’s Web Services blog about the technical underpinnings of the new feature, the transcoding is handled by Amazon’s Elastic Transcoder service, which supports over 20 file formats and 40 video codecs. The team says its goal was to have videos transcoded within 15 minutes of uploading, but in practice videos are often ready within a minute or two. The team also processed all the videos already stored in users’ Cloud Drive libraries ahead of launch.
Though the company offers a version of its Amazon Cloud Drive Photos app on iOS devices, too, only the Android version has received video support at this time. That makes sense: not only is the Kindle Fire an Android-based tablet, and therefore Amazon’s priority, but the Android app was also the first to launch, back in November 2012.
The iOS version didn’t arrive until this May, and it serves as a viable alternative to Apple’s own iCloud sync and storage service, with reasonable pricing of 5 GB for free, then $10/year for 20 GB, $25/year for 50 GB and so on, all the way up to 1,000 GB for $500/year. Keep in mind that the storage goes up so high not because users need so much space for photos (and now videos, too), but because Amazon Cloud Drive is meant to serve as a competitor to Google Drive or Dropbox, with support for a variety of file types, including office documents and music, which can also be streamed back through Amazon Cloud Player.
In other words, this isn’t the first time users could upload videos to Amazon’s cloud. This is just making it possible to do so within the Cloud Drive Photos application.
The updated Cloud Drive app is available now on Google Play and Amazon’s Appstore.
Bored of quantifying yourself already? Why not quantify your pet instead? FitBark is a Fitbit-style health tracker for your under-walked canine companion. We’ve covered this (frankly) barking mad gizmo before, back in May, when its creators were exhibiting at Hardware Alley at TechCrunch Disrupt NY, but they’ve now taken to Kickstarter to raise funds to get the device out in the wild. Again.
It’s actually FitBark’s second attempt at Kickstarting the gizmo. As Gigaom points out, its creators pulled an earlier attempt at crowdfunding the device in order to rethink the business model, scrapping the monthly subscription fee and opting for a fixed price-tag of $69 via Kickstarter or $99 for general retail.
FitBark are after $35,000 to cover manufacturing costs this time around, and are more than half-way to achieving the target with 32 days left to run on the campaign – so crazy or otherwise, this is one hardware startup that’s pretty much a dead cert for its first manufacturing run-around-the-park at least.
Now, I say barking mad, but that’s mostly tongue-in-cheek, seeing as FitBark is not the only health tracker angling for pet owners’ cash. Whistle, a startup backed by $6 million in Series A funding, launched a $99 wearable activity tracker for dogs only last month. There’s also Tagg, which combines activity and location tracking by including GPS in its device. So underestimate the pet-owning dollar at your peril.
So what does FitBark actually do? Attach it to your dog’s collar and it tracks daily activity levels, sending the data back to FitBark’s servers when your smartphone is in range, or throughout the day if you purchase a dedicated FitBark base station (and keep your pet penned up at home while you’re out). The latter scenario would allow owners to keep remote tabs on their pet’s activity levels when they’re not at home, but unless you own a mansion (or employ a dog walker) your dog isn’t going to be able to do a whole lot of running around without you. FitBark then crunches all the activity data, offering customisable daily activity goals, and delivers the results back to you via an app. So far, so kinda sane.
At its more barking mad fringe, the FitBark also lets pet owners compare – well, they say “unify” – their own fitness with their dog’s fitness/activity. So yeah, boasting that you are fitter than Fido is apparently a thing now…
As the company’s pitch puts it: “FitBark is also the first platform that leverages existing APIs of human fitness trackers to bring you a unified view of your fitness level and that of your dog. From the outset, FitBark will seamlessly receive input from your Nike Fuelband, Fitbit, Withings Pulse, or Bodymedia Fit. We’ll look to expand the list as we learn about new open APIs or partnership opportunities. If you’re not only a devoted dog parent but are also serious about tracking your own fitness, you’ll love this.”
Skype’s latest app upgrade brings a few substantial features, some good, one not so. Alongside a new anti-shake video call function (limited to the iPhone’s back-facing camera), you can now pair Bluetooth headsets with the VoIP calling service, something apparently “long requested” from Skype fans. However, users have to fork out for credit to avoid seeing advertising that’s also baked into the new version. The update’s now up for grabs on both the iPhone and iPad, though there are reports of a few teething troubles, including missing credit and account details. We’ve also been experiencing issues, with the app unwilling to play nice with our Bluetooth headsets, though oddly, we can still hear the Skype call ring through. Hopefully we’ll see another update that sorts this out soon.
Before Instagram Direct was unveiled, I had reservations about how private messaging would be integrated into the platform.
The pure, immediate joy of Instagram comes from its simplicity, both in terms of its design and the sharing it facilitates. You capture a photo or short video clip, then post it to a stream where others can view and comment on it. For all of the tweaks and updates that have rolled out since Facebook’s acquisition of Instagram, that core has always stayed the same.
The prospect of a traditional messaging experience, similar to mobile apps such as WhatsApp, LINE, or even Facebook’s own Messenger app, filled me with dread. The thought of sending quick messages back and forth, with the odd photo or video for good measure, felt counter-intuitive to Instagram’s DNA. A useful feature, perhaps, but one better served by an array of existing apps.
Thankfully, Instagram opted for a simpler implementation. You still have the option of publicly posting a photo or video, but it’s now joined by Direct, which lets you create a private thread with up to 15 people. These discussions aren’t like other messaging apps though, with an endless stream of text bubbles flanking the left and right-hand sides of the display. Instead, the company has stuck to its roots by subtly repurposing the UI of any normal Instagram post.
That means placing the original photo or video on a pedestal. So when you first open a private thread in your inbox, you’ll always see the shared moment at the top of the page. Furthermore, no-one else can post a photo or video without starting a new, private thread inside Instagram.
That might seem a little basic, annoying or short-sighted, but the impact on the UX is profound. Instagram is giving users another place to discuss their photos and videos, instead of facilitating broader conversations usually reserved for other mobile messaging apps. That helps to protect the core premise of Instagram, which has always been about sharing and commenting on interesting moments. Admittedly, you can write about anything in an Instagram Direct thread, but because subsequent comments are always tied to a specific photo or video, the feature subtly discourages you from going too far off topic.
Instagram isn’t Snapchat
The experience is tailored to Instagram and far removed from that found on Snapchat, the red-hot messaging app that lets users send and view content for a limited period of time. Now, Instagram could have offered a timer for private messages, but in my opinion that would have derailed its original intentions.
Instagram Direct is about more than just viewing a photo or video privately – it’s about the conversations that occur afterwards. I have relatives based halfway around the world, and if they choose to share an important moment with me using Instagram Direct, it’s important that I can see it for more than 10 seconds. Otherwise, if I submit a comment and carry on with my day, I won’t be able to access the photo or video when they reply later on.
In that scenario, any meaningful discussion would be impossible. That’s not to say Instagram couldn’t be successful as a Snapchat clone – I’m sure young people would happily use it for sexting – but such a feature would fly against what makes Instagram such a delightful app.
Instagram is popular because it creates conversations. It’s a simple way not only to share photos and videos, but also to comment and leave feedback on those posted by other people (even if that just means leaving a couple of likes on someone’s profile page). The UI has been designed to make that feedback loop as quick and simple as possible. Instagram Direct is simply an extension of that: sharing moments and then talking about them. Only this time, it happens privately, with a smaller group of people.
When it comes to photo-oriented web services, for every slam-dunk success like Instagram, there seem to be quite a few Color-esque shutdowns. And today, another photo startup is hitting the deadpool.
Everpix, the San Francisco startup that launched its cloud-oriented photo storage and sharing platform at SF Disrupt in 2011, announced today it is shutting down. Starting now, new sign-ups and subscriptions are not available, and Everpix apps will switch to read-only mode. Operations will cease completely on December 15th, 2013. Everpix says that it will email its users with full details regarding exporting their photos and obtaining refunds.
The company, which had raised $1.8 million from investors including Index Ventures and 500 Startups, has six staff listed on LinkedIn. In a post on Everpix’s official blog today entitled “We Gave It Our All…”, the company said that the shutdown comes as a result of failing to secure more capital or find an acquirer:
“It is with a heavy heart we announce that Everpix will be shutting down in the coming weeks.
…It’s frustrating (to say the least) that we cannot continue to work on Everpix. We were unable to secure sufficient funding in order to properly scale the business, and our endeavors to find a new home for Everpix did not come to pass. At this point, we have no other options but to discontinue the service.”
Casey Newton wrote a great post-mortem of Everpix for The Verge today, which you can read here. I’ve reached out to Everpix’s CEO Pierre-Olivier Latour for additional comment on the shutdown, and will update this post with any response.
Everpix really had created a beautiful app – you can see a demo of one of the most recent versions in this video interview from this past March – but it seems it just wasn’t enough to keep things alive. Kudos to the Everpix team for being honest about its shutdown story and the difficulties the company faced in its final days.
Do you have an idea for an app but lack the programming knowledge to begin building it? In this weekly blog series, I will take you, the non-programmer, step by step through the process of creating apps for the iPhone, iPod touch, and iPad. Join me each week on this adventure, and you will experience how much fun turning your ideas into reality can be! This is Part 23 of the series. If you are just getting started, check out the beginning of the series here.
When I left you last, I had handed out a homework assignment to put the iOS app development skills you have learned so far to the test. Your goal was to convert the My Reviews and Review scenes in the iAppsReview app from prototypes to fully functioning views. In this post, I’ll go through the steps for you, and I recommend comparing them to the steps you took in completing the homework assignment.
Figure 1 shows how the two scenes will look after completing the steps in this post.
|Figure 1 – The completed My Reviews and Review scenes at run time|
If you need the latest version of the iAppsReview project, you can download it from this link. If you run into any troubles along the way, you can get the completed project for this post at this link.
Let’s get started.
Converting the My Reviews Scene
The first step in converting the My Reviews scene to a fully functioning scene is to change the table view from static to dynamic cells.
- In Xcode, open the iAppsReview project.
- In the Project Navigator, select the MainStoryboard.storyboard file.
- Scroll to the My Reviews scene in the storyboard.
- Click in the gray area below the table view to select the table view.
- Go to the Attributes Inspector, and change the Content type to Dynamic Prototypes.
- Since all the cells in this table view have the same style, we can delete all the cells except the first. To do this, click on the second cell, hold the Shift key down, then click on the third, fourth, and fifth cells. With the last four cells selected, press the Delete key. This leaves one cell remaining as shown in Figure 2.
|Figure 2 – Leave just one cell remaining.|
- With the remaining cell in the table selected, go to the Attributes Inspector and set the cell’s Identifier attribute to ReviewCell (Figure 3).
|Figure 3 – Set the cell’s Identifier to ReviewCell.|
Creating the My Reviews Table View Controller
For our next step, we need to create a table view controller that we can use in conjunction with the My Reviews scene to fill and manage the table view.
- In the Project Navigator, right click the iAppsReview group, and select New File… from the popup menu.
- On the left side of the New File dialog under the iOS section, select Cocoa Touch. On the right side of the dialog, select the Objective-C class template (Figure 4), and then click the Next button.
|Figure 4 – Create a new Objective-C class.|
- In the next step of the New File dialog, set the Class to MyReviewsViewController and set the Subclass of to UITableViewController (Figure 5).
|Figure 5 – Create a new MyReviewsViewController class as a subclass of UITableViewController|
- In the Save File dialog, click the Create button. This adds the new class files to the Project Navigator (Figure 6).
|Figure 6 – The new table view controller files|
- As soon as you create a new table view controller, you should immediately associate it with its scene (because it’s easy to forget). To do this, go to the Project Navigator and click on the MainStoryboard file. Then go to the My Reviews scene and click on the status bar at the very top of the scene to select the table view controller. Next, go to the Identity Inspector (the third button from the left in the Inspector toolbar) and change the Class to MyReviewsViewController (Figure 7).
|Figure 7 – Set the view controller class to MyReviewsViewController.|
Now we’re ready to add code to the new class to fill and manage the My Reviews scene’s table view.
Setting Up the Table View Controller
We are going to need the services of the Review business controller class to retrieve ReviewEntity objects from the database, so let’s start there.
- In the Project Navigator, select the MyReviewsViewController.m file.
- At the top of the MyReviewsViewController.m file, add the import statements shown in Figure 8.
|Figure 8 – Add these import statements.|
- We need a place to store the Review business controller object and the ReviewEntity objects that are retrieved from the database, so add the instance variables shown in Figure 9.
|Figure 9 – Add these instance variables.|
- Next, add the code shown in Figure 10 to the bottom of the viewDidLoad method (you can delete all existing comments in viewDidLoad first).
|Figure 10 – Add this code to the viewDidLoad method.|
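Since the figures aren’t reproduced here, the following is a rough sketch of what the instance variables and the viewDidLoad addition likely amount to. The Review and ReviewEntity classes come from the text above, but the names reviewObj, reviewList, and the getAllEntities method are assumptions, not the figures’ exact contents:

```objc
#import "MyReviewsViewController.h"
#import "Review.h"        // business controller class
#import "ReviewEntity.h"  // entity retrieved from the database

@implementation MyReviewsViewController
{
    // Business controller used to retrieve reviews (assumed name)
    Review *reviewObj;
    // The ReviewEntity objects displayed in the table view (assumed name)
    NSArray *reviewList;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Create the business controller and fetch all stored reviews.
    // getAllEntities is an assumed method name on the Review class.
    reviewObj = [[Review alloc] init];
    reviewList = [reviewObj getAllEntities];
}

@end
```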
Next, we need to implement the table view data source methods in the view controller.
- In the numberOfSectionsInTableView: method, delete the #warning declaration and add the code shown in Figure 11.
|Figure 11 – Add this code to the numberOfSectionsInTableView: method.|
- In the tableView:numberOfRowsInSection: method, delete the #warning declaration and add the code shown in Figure 12.
|Figure 12 – Add this code to the tableView:numberOfRowsInSection: method.|
- In the tableView:cellForRowAtIndexPath: method, change the code as shown in Figure 13.
|Figure 13 – Add this code to the tableView:cellForRowAtIndexPath: method|
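Putting those three data source methods together, a sketch of how they typically end up is shown below. The ReviewCell identifier matches the one set on the prototype cell earlier; the reviewList instance variable and the appName property on ReviewEntity are assumed names:

```objc
- (NSInteger)numberOfSectionsInTableView:(UITableView *)tableView
{
    // The My Reviews list is a single, flat list of reviews
    return 1;
}

- (NSInteger)tableView:(UITableView *)tableView
 numberOfRowsInSection:(NSInteger)section
{
    // One row per ReviewEntity retrieved in viewDidLoad
    return reviewList.count;
}

- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    // Dequeue the prototype cell configured in the storyboard
    UITableViewCell *cell =
        [tableView dequeueReusableCellWithIdentifier:@"ReviewCell"
                                        forIndexPath:indexPath];

    // Display the app name for this row's review (appName is assumed)
    ReviewEntity *review = reviewList[indexPath.row];
    cell.textLabel.text = review.appName;
    return cell;
}
```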
Testing the My Reviews Scene
Now we’re ready to check out our code to see how it works.
- Click Xcode’s Run button.
- When the app appears in the Simulator, first tap the Write a Review option to create a review and post it.
- Next, in the main iAppsReview scene, select the Read Your Reviews option, and you should see all of the reviews that you have added so far (Figure 14).
|Figure 14 – The My Reviews scene at run time.|
You have met with success!
Converting the Review Scene
Before leaving the My Reviews scene in the Simulator, click on one of your reviews in the list and you will be taken to the Review scene. This scene is still a prototype and displays static information about the Doodle Jump app.
- Click Xcode’s Stop button.
- In the Project Navigator, select the MainStoryboard file and scroll to the Review scene. All of the UI controls we need have been placed on the scene and are ready to accept live data. There’s nothing else we need to do to the user interface of this scene right now. I just wanted you to take a look at it before we begin.
Creating the Review View Controller
For many of the other scenes we have converted, we created an associated table view controller. However, since this scene doesn’t contain a table view, we can create just a plain view controller.
- In the Project Navigator, right-click the iAppsReview node and select New File… from the popup menu.
- On the left side of the New File dialog under the iOS section, select Cocoa Touch.
- On the right side of the dialog, select the Objective-C class template and click the Next button.
- In the next step of the dialog, set the Class to ReviewViewController and set Subclass of to UIViewController (not UITableViewController) as shown in Figure 15.
|Figure 15 – Create a new ReviewViewController class.|
- Click the Next button, and in the Save File dialog, click the Create button. This adds two new class files to the Project Navigator (Figure 16).
|Figure 16 – New view controller files|
- Now let’s associate the new view controller with the Review scene. In the Project Navigator, select the MainStoryboard file. Afterwards, click the status bar at the top of the Review scene, then go to the Identity Inspector and set the Class to ReviewViewController (Figure 17).
|Figure 17 – Set the Class to ReviewViewController.|
Setting Up the Review View Controller
Let’s start out by adding outlets to the user interface controls so we can access them from the view controller.
- Turn on the Assistant Editor by going to the top of the Xcode window and clicking the center button in the Editor button group.
- Click on the Review scene in the storyboard. If the ReviewViewController.h file is not automatically displayed in the Assistant Editor, go to the jump bar at the top of the Assistant Editor, click the Manual button and select Automatic > ReviewViewController.h.
- At the top of the ReviewViewController.h file, add the import statement shown in Figure 18.
|Figure 18 – Import mmStarRating.h.|
- Click on the Doodle Jump label in the Review scene to select it, hold the Control key down and then click and drag down into the ReviewViewController.h file as shown in Figure 19.
|Figure 19 – Create an outlet for the label.|
- When you see the Insert Outlet or Outlet Collection popup, let go of the mouse button and Control key.
- In the Create Connection popup, set the Name to lblAppName and then click Connect. This adds a new outlet property to the code file as shown in Figure 20.
|Figure 20 – The new outlet property|
- Now create outlets for the other controls in the scene. Give the outlets the following names:
- Entertainment label – lblCategory
- Five Star Rating control – starRating
- Comments text view – tvwComments
- Image view – imgThumbnail
When you’re finished, the ReviewViewController.h should look like Figure 21.
|Figure 21 – The completed outlet properties|
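For reference, here is a sketch of what ReviewViewController.h should roughly contain once all five outlets are connected. The mmStarRating type comes from the imported header; the weak/nonatomic attributes are Interface Builder’s usual defaults, not something confirmed by the figure:

```objc
#import <UIKit/UIKit.h>
#import "mmStarRating.h"

@interface ReviewViewController : UIViewController

// Outlets created by Control-dragging from the storyboard
@property (weak, nonatomic) IBOutlet UILabel *lblAppName;
@property (weak, nonatomic) IBOutlet UILabel *lblCategory;
@property (weak, nonatomic) IBOutlet mmStarRating *starRating;
@property (weak, nonatomic) IBOutlet UITextView *tvwComments;
@property (weak, nonatomic) IBOutlet UIImageView *imgThumbnail;

@end
```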
Passing Data to the Review View Controller
In a previous post, I discussed how to pass data between view controllers. The scenario with the Review view controller is a little easier, because we only need to pass data one way. We need to pass the currently selected review’s information from the My Reviews scene to the Review scene. However, since the Review scene doesn’t allow editing, we don’t need to pass any information back.
By way of review, here are the three main steps we need to perform to pass data to a view controller.
- Create a property on the destination view controller to hold the data being passed by the source view controller.
- Configure the segue between the source and destination view controllers.
- In the source view controller, implement the prepareForSegue: method and add code that stores the data to be passed to the destination view controller’s property.
In this scenario, it makes sense to pass a ReviewEntity object from the My Reviews view controller to the Review view controller.
- Close the Assistant Editor by clicking the left button in the Editor button group at the top of the Xcode window.
- In the Project Navigator, select the ReviewViewController.h file and add the import statement and property declaration shown in Figure 22.
|Figure 22 – Add the import statement and property declaration.|
- In the Project Navigator, select the ReviewViewController.m file and add the code shown in Figure 23 to the viewDidLoad method. Notice we left out the code that stores information in the image view control and the category label. This requires some special code, so we’ll get to these items in a future post!
|Figure 23 – Add this code to the viewDidLoad method.|
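A sketch of what that viewDidLoad code plausibly looks like, assuming the property added in Figure 22 is named reviewEntity and that ReviewEntity exposes appName, rating, and comments (all assumed names, as is the rating property on mmStarRating):

```objc
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Copy the passed-in review's data into the outlets.
    // The category label and image view are deliberately skipped;
    // they need special handling covered in a future post.
    self.lblAppName.text = self.reviewEntity.appName;
    self.starRating.rating = [self.reviewEntity.rating intValue];
    self.tvwComments.text = self.reviewEntity.comments;
}
```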
Our next step is to configure the segue between the My Reviews and Review scenes.
- In the Project Navigator, select the MainStoryboard file.
- Click on the segue between the My Reviews and Review scene. Notice that when you do this, Xcode highlights the Doodle Jump row in the table view (if it doesn’t, you have the wrong segue selected). This is because the segue is currently hard-coded to this specific row. Now that we are converting the app from a prototype, we need to delete this segue and create a new one.
- Press the Delete key to delete the segue.
- Click the status bar of the My Reviews scene to select the table view controller.
- Hold the Control key down, and then in the scene dock below the My Reviews scene, click the view controller icon on the left. Drag your mouse pointer to the Review scene until the scene is highlighted in blue.
- In the segue popup, select Push.
- Click on the segue in the storyboard to select it, and then go to the Attributes Inspector and set the Identifier to ReviewSegue as shown in Figure 24.
|Figure 24 – Set the segue Identifier to ReviewSegue.|
Now let’s go to the source view controller and implement the prepareForSegue: method.
- In the Project Navigator, select the MyReviewsViewController.m file.
- At the top of the code file add the import statement shown in Figure 25.
|Figure 25 – Import ReviewViewController.h|
- Add the prepareForSegue method shown in Figure 26 directly below the viewDidLoad method.
|Figure 26 – Add this prepareForSegue method to the MyReviewsViewController.m file.|
- Now scroll to the bottom of the code file. Delete all of the existing comments from the tableView:didSelectRowAtIndexPath: method and replace it with the code shown in Figure 27. This code triggers the segue when the user taps a row in the table view.
|Figure 27 – Add this code to the tableView:didSelectRowAtIndexPath: method.|
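Taken together, the segue code in MyReviewsViewController.m typically ends up looking something like this sketch. The ReviewSegue identifier matches the one set in the storyboard; reviewList and the destination’s reviewEntity property are assumed names:

```objc
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([segue.identifier isEqualToString:@"ReviewSegue"]) {
        // Hand the tapped row's ReviewEntity to the destination controller
        ReviewViewController *controller = segue.destinationViewController;
        NSIndexPath *indexPath = [self.tableView indexPathForSelectedRow];
        controller.reviewEntity = reviewList[indexPath.row];
    }
}

- (void)tableView:(UITableView *)tableView
    didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    // The segue was wired from the view controller itself, not from a cell,
    // so it must be triggered manually when the user taps a row
    [self performSegueWithIdentifier:@"ReviewSegue" sender:self];
}
```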
That’s it! You have added all the code you need to pass data from the My Reviews view controller to the Review view controller.
Testing the Review Scene
Now we’re ready to take it for a spin.
- In Xcode, click the Run button.
- When the app appears in the Simulator, select the Read Your Reviews row.
- Select one of the rows from the list and you should see its information displayed in the Review scene as shown on the right side of Figure 1 at the beginning of this post!
I recommend that you add several reviews with different ratings, comments, and so on, then come back to the My Reviews scene and check them out. REMEMBER, as the app stands right now, you won’t see a change in the Category or the image, but we’ll address this in a future post.
So how did you do? It’s amazing what you can learn when you take off the training wheels and try it on your own. If you’re a little foggy on the steps outlined here, I recommend going through them a few times until they make sense. As always, you can ask any questions you have in the comments for this post!
The Oculus Rift virtual reality headset isn’t even available for consumers to buy yet, but here comes the cut-price competition… While the Rift development kit will set you back $300 – and still requires a PC to do the gaming horse-work – vrAse, a soon-to-be-launched-on-Kickstarter project, is approaching virtual reality from another direction. It wants to turn your existing smartphone into a pair of wearable virtual reality/3D specs. And do so for as little as £48/$75.
Since high-end smartphones are powerful computers in their own right, and come furnished with cameras front and back, why not just stick your phone right on your face? Provided you don’t mind looking like Mr Phone Face, of course. vrAse is one part Oculus Rift, one part Google Glass, one part sci-fi ski goggles – with gaming, 3D movie-watching and augmented reality use-cases envisaged by its creators, assuming they can get developers to make the apps to go with their goggles.
At launch there’s clearly not going to be a lot of ready-to-rock apps, but the creators say they will offer demo content to show off vrAse’s AR and 3D gaming capabilities. Plus, any movies already made for 3D can be downloaded or streamed in Side-by-Side (SBS) format for viewing on vrAse. And films and games can also apparently be converted to SBS for viewing on the device.
vrAse is effectively a toughened smartphone case, attached to a pair of wearable goggles. Your existing smartphone slides inside the case so you’re looking directly at its screen through vrAse’s dual lenses – which generate the 3D/immersion effect. And that’s pretty much it. Compatible smartphones at launch are the iPhone 5, HTC One, Xperia Z, Galaxy S3, Galaxy S4 and Galaxy Note 2. In future the creators say they will make it compatible with any smartphone.
How immersive will vrAse be? That’s the key question. And the answer will depend (in part) on the smartphone screen you’re pairing it with. The higher the screen res, the better looking the picture will presumably be. Beyond that, vrAse’s creators aren’t going into detail about what sort of field of vision to expect from vrAse so it’s hard to say how it will stack up against the likes of Oculus Rift. It is looking considerably cheaper to buy, however, so set your expectations accordingly. Update: vrAse says the range of vision is configurable but currently the device offers more than 105 degrees of binocular vision field.
vrAse’s makers are hoping to raise £55,000 via Kickstarter. If they hit their target, they’re aiming to ship to backers in February. Their crowdfunding campaign kicks off on Saturday.