Wednesday, October 3, 2012

RubyMotion autoresizingMask


view.autoresizingMask

This property is often neglected by beginner iOS programmers, yet it is one of the most useful for laying out your interfaces. Consider the situation where you want your view to resize when the device is rotated between portrait and landscape orientations. The novice programmer will detect the screen rotation and explicitly set the “frame” property of every view to fit the new screen size. This is the WRONG way to manage your interface.
Instead, you can use the “autoresizingMask” property to describe how you want your view to adapt to orientation changes. Your options (shortened for brevity) are:
  • None
  • FlexibleWidth
  • FlexibleHeight
  • FlexibleLeftMargin
  • FlexibleRightMargin
  • FlexibleTopMargin
  • FlexibleBottomMargin
For example, in RubyMotion:

    
view.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                        UIViewAutoresizingFlexibleHeight |
                        UIViewAutoresizingFlexibleLeftMargin |
                        UIViewAutoresizingFlexibleRightMargin |
                        UIViewAutoresizingFlexibleTopMargin |
                        UIViewAutoresizingFlexibleBottomMargin

Autorotation in RubyMotion


In your Rakefile, add require 'motion-cocoapods' at the top and set the allowed orientations inside the Motion::Project::App.setup block:

    require 'motion-cocoapods'

    Motion::Project::App.setup do |app|
      app.interface_orientations = [:portrait]
    end

In your controller, implement shouldAutorotateToInterfaceOrientation:, which must return true for each orientation you want to allow. For example, to allow only landscape-left while also recording the current device orientation:

    def shouldAutorotateToInterfaceOrientation(orientation)
      @orientation = UIDevice.currentDevice.orientation
      orientation == UIInterfaceOrientationLandscapeLeft
    end

The current device orientation is then available in @orientation.

RubyMotion: removing the status bar

To hide the status bar, use:

    UIApplication.sharedApplication.setStatusBarHidden(true)




Monday, July 16, 2012

What’s the Difference between the Accelerometer and the Gyroscope?

Measurements

The accelerometer measures force in the X, Y, and Z dimensions. Think of holding the device directly in front of you… moving the device vertically in space changes the y value, moving it horizontally changes the x value, and moving it towards or away from you changes the z value.
The gyroscope measures orientation in the alpha, beta, and gamma dimensions. Think of holding the device directly in front of you… moving the device like you would steer a car changes the alpha value, tilting the device forwards and backwards affects the beta value, and twisting the device like you’re opening a soda will change the gamma value.

Force vs. Orientation

When executing the code blocks below, you’ll notice that the X, Y, and Z values will read zero, or close to zero, when holding the device motionless. This is because the accelerometer only outputs the force of a change in motion, not the orientation itself. Modern browsers also supply the “accelerationIncludingGravity” property when emitting the ondevicemotion event; it works differently and we will not be using it in this post.
Unlike the accelerometer, all of the channels of the gyroscope will continuously read the orientation of the device, even when motionless. Try moving the phone in space without any twisting force… you’ll notice little to no effect in the gyroscopic values.

The Code

I’m starting with an entirely blank html file. The only things we’ll need to include are a reference to Zepto and a series of DOM elements to hold the textual information we’ll be logging out. You can download the final zip package here – http://joelongstreet.com/blog_files/ios_accelerometer_1/package.zip.
All I’m doing here is associating DOM elements with JavaScript variables. Nothing special.

var x_dom = $('.x');
var y_dom = $('.y');
var z_dom = $('.z');
var a_dom = $('.a');
var b_dom = $('.b');
var g_dom = $('.g');
Accelerometer – The “ondevicemotion” event is native to browsers that support the accelerometer. We can use this event to constantly track the changes in motion related to the X, Y, and Z dimensions. After collecting the data from the event, we write the information to the DOM.

window.ondevicemotion = function(event) {
    var x = event.acceleration.x;
    var y = event.acceleration.y;
    var z = event.acceleration.z;

    x_dom.text(x);
    y_dom.text(y);
    z_dom.text(z);
}


Gyroscope – We’re doing the same thing here with the gyroscope. Like the accelerometer, this event is native to devices that support the respective instrument.

window.ondeviceorientation = function(event) {
    var a = event.alpha;
    var b = event.beta;
    var g = event.gamma;

    a_dom.text(a);
    b_dom.text(b);
    g_dom.text(g);
}



If you run the above code, you’ll notice the instruments are very, very sensitive and the strings returned are rather long. Since it’s difficult to read that kind of information in the DOM, I’ve added a rounding function to display a single whole number. Also, prefixing positive values with a + sign makes the switch between positive and negative numbers less visually jarring.
The Final Product:

var x_dom = $('.x');
var y_dom = $('.y');
var z_dom = $('.z');
var a_dom = $('.a');
var b_dom = $('.b');
var g_dom = $('.g');

//This is only for devices that support the use of a gyroscope (iPhone 4, iPad2, iPod Touch)
window.ondeviceorientation = function(event) {
    var a = Math.round(event.alpha);
    var b = Math.round(event.beta);
    var g = Math.round(event.gamma);

    if (a >= 0) { a = '+' + a; }
    if (b >= 0) { b = '+' + b; }
    if (g >= 0) { g = '+' + g; }

    a_dom.text(a);
    b_dom.text(b);
    g_dom.text(g);
}

window.ondevicemotion = function(event) {
    var x = Math.round(event.acceleration.x);
    var y = Math.round(event.acceleration.y);
    var z = Math.round(event.acceleration.z);

    if (x >= 0) { x = '+' + x; }
    if (y >= 0) { y = '+' + y; }
    if (z >= 0) { z = '+' + z; }

    x_dom.text(x);
    y_dom.text(y);
    z_dom.text(z);
}
Finished. Not a whole lot to it, and pretty easy overall. I’m really looking forward to seeing how people use these new technologies to build web applications. Note: in real-world scenarios it’s probably overkill to collect and broadcast information on every device motion and orientation change; you’ll want some kind of interval-based data collection to avoid overwhelming the browser.
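That interval-based collection can be sketched generically. Here is a minimal illustration in Ruby (the class name and API are hypothetical, chosen just to show the throttling idea, not browser code):

```ruby
# A minimal sampler that ignores readings arriving faster than min_interval
# seconds apart. Illustrates the throttling idea, nothing more.
class ThrottledSampler
  attr_reader :samples

  def initialize(min_interval)
    @min_interval = min_interval
    @last_time = nil
    @samples = []
  end

  # Record the reading only if enough time has passed since the last
  # accepted one. Returns true if the reading was kept.
  def record(time, value)
    return false if @last_time && (time - @last_time) < @min_interval
    @last_time = time
    @samples << value
    true
  end
end
```

With a minimum interval of 0.25 seconds, a sensor firing every 0.1 seconds would be reduced to roughly four kept samples per second.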

Monday, July 9, 2012

Motion sensing in the iPhone 4: MEMS accelerometer

The power and attractiveness of Apple's iPhone 4 lies in the sophisticated integration of multiple sensing technologies.  Of specific note is the integration of a full 9 degrees-of-freedom (DoF) motion sensing.  The iPhone 4 is the first portable consumer device to incorporate a three-axis accelerometer, three-axis gyroscope, and three-axis electronic compass.  The addition of these sensors allows for much better rotational motion sensing, gaming, image stabilization, dead reckoning for GPS, gesture recognition and other applications, than was possible with only an accelerometer.
The iPhone family of products has been evolving towards the goal of full 9DoF motion sensing.  The original iPhone, which was launched in June 2007, incorporated only a STMicroelectronics LIS302DL accelerometer, while the iPhone 3G incorporated an STMicroelectronics LIS331DL accelerometer, both corresponding to 3DoF sensing. The iPhone 3GS incorporated an STMicroelectronics LIS331DL accelerometer and an AKM AK8973 electronic compass, thus providing 6DoF sensing. The iPhone 4, released on June 24, 2010, featured full 9DoF motion sensing, plus three microphones, two image sensors, ambient light and proximity sensors, and the archetypal touch screen sensor.
Chipworks has completed a full analysis, down to the silicon, of the three motion sensors found in the iPhone 4. In this three part series, we present some highlights from our analysis, with specific focus on the MEMS sensors, how they are made, and how they work.  This article will review the results of our teardown of the iPhone 4 and will include a discussion of the three-axis accelerometer. The next article will provide the results of some of our analysis on the three-axis gyroscope used in the iPhone 4, while the third article will provide a review of the three-axis electronic compass technology.
iPhone 4 Teardown
Figure 1 below shows the top side of the main printed circuit board. The STMicroelectronics LIS331DLH accelerometer and the L3G4200D gyroscope devices are placed side-by-side, adjacent to the Apple-designed, Samsung-fabbed A4 microprocessor. Although not labeled with ST part numbers, the devices were identified through comparison with devices previously analyzed by Chipworks. The AKM8975 electronic compass is found on the other side of the main board, adjacent to a Samsung flash memory chip, as seen in Figure 2.  These three motion sensors appear to independently provide signals to the iOS operating system running on the A4 microprocessor.  These signals are then integrated in software by the apps.
Figure 1: iPhone 4 main board top side.
Figure 2: iPhone 4 main board bottom side.
iPhone 4 Three-Axis Accelerometer: STMicroelectronics LIS331DLH
The LIS331DLH device comes packaged in a 3 mm x 3 mm x 1 mm thick LGA type package.   It contains two chips, an ASIC and a MEMS.  The MEMS chip incorporates a cap, as can be seen in Figure 3, which sandwiches the micromachined layer in a cavity between the MEMS die and the cap die.  A lead-doped frit glass seal is used to hermetically seal the cavity.  The ASIC die lies on top of the MEMS cap, and is wire bonded to both the MEMS die and the package substrate, which provides interconnection to the outside world.  The stacked, ASIC over MEMS, geometry has been used in all the ST inertial sensors analyzed by Chipworks.  Other vendors use a side-by-side geometry, which makes thinning the package to below 1 mm easier, but makes shrinking the linear dimensions more challenging.
Figure 3: LIS331DLH package cross section.
Decapsulation of the LGA package, followed by removal of the cap, reveals the structure of the MEMS die, shown in Figure 4.  The C5L12B MEMS die, which has 2008 mask marks, contains a separate sensor for XY and Z linear acceleration.  The die was fabricated with ST's thick epi-poly layer for micro-actuators and accelerometers (THELMA) process, which provides a 2 μm minimum feature size in a two polysilicon surface micromachined MEMS process.
The XY sensor consists of a polysilicon proof mass, which is anchored to the substrate via leaf springs that constrain the proof mass motion to the XY plane.  Interdigitated, parallel-plate capacitors, mounted within the proof mass structure, capacitively sense inertial deflection of the proof mass in the X and Y directions.  The Z-sensor is formed using a top polysilicon plate that is cantilevered on a torsion spring. Inertial deflection of this plate is sensed capacitively by a bottom polysilicon plate.
Figure 4: LIS331DLH C4L12B three-axis accelerometer MEMS die.
A detailed view of the XY sensor, showing a portion of the proof mass supported by the leaf spring, is given in Figure 5.  The interdigitated capacitor plates are attached alternately to the proof mass and to fixed anchors to the substrate.
Figure 5:  LIS331DLH XY sensor detail.
The ASIC controller processes signals from the MEMS structure. It is likely that the ASIC actually uses capacitive feedback to maintain a DC bias on the plates, such that the capacitance (plate spacing) remains constant.  This DC bias would be the output signal.  The ASIC delivers a digital I2C/SPI serial interface standard output to the A4 processor.

Basic iOS 5 iPad Animation using Core Animation

The majority of the visual effects used throughout the iOS 5 user interface on the iPad are performed using Core Animation. Core Animation provides a simple mechanism for implementing basic animation within an iPad application. If you need a user interface element to gently fade in or out of view, slide smoothly across the screen or gracefully resize or rotate before the user’s eyes, these effects can be achieved using Core Animation in just a few lines of code.
In this chapter we will provide an overview of the basics of Core Animation and work through a simple example. While much can be achieved with Core Animation, it should be noted that if you plan to develop a graphics-intensive 3D style application then OpenGL ES will more likely need to be used, a subject area to which numerous books are dedicated.





UIView Core Animation Blocks

The concept of Core Animation involves the implementation of so-called animation blocks. Animation blocks are used to mark the beginning and end of a sequence of changes to the appearance of a UIView and its corresponding subviews. Once the end of the block is reached the animation is committed and the changes are performed over a specified duration. For the sake of example, consider a UIView object that contains a UIButton connected to an outlet named theButton. The application requires that the button gradually fade from view over a period of 3 seconds. This can be achieved by making the button transparent through the use of the alpha property:
theButton.alpha = 0;
Simply setting the alpha property to 0, however, causes the button to immediately become transparent. In order to make it fade out of sight gradually we need to place this line of code in an animation block. The start of an animation block is represented by a call to the beginAnimations class method of the UIView class:
[UIView beginAnimations:nil context:nil];
The end of the animation block triggers the animation sequence through a call to the commitAnimations method:
[UIView commitAnimations];
A variety of properties may also be defined within the animation block. For example, the duration of the animation (in our hypothetical example this needs to be 3 seconds) can be declared by a call to the setAnimationDuration class method:
[UIView setAnimationDuration:3];
Bringing this all together gives us a code sequence to gradually fade out a button object over a period of 3 seconds:
[UIView beginAnimations:nil context:nil];
[UIView setAnimationDuration:3];
theButton.alpha = 0;
[UIView commitAnimations];

Understanding Animation Curves

In addition to specifying the duration of an animation sequence, the linearity of the animation timeline may also be defined by calling the UIView setAnimationCurve class method. This setting controls whether the animation is performed at a constant speed, whether it starts out slow and speeds up and so on. There are currently four possible animation curve settings:
  • UIViewAnimationCurveLinear – The animation is performed at constant speed for the specified duration.
  • UIViewAnimationCurveEaseOut – The animation starts out fast and slows as the end of the sequence approaches.
  • UIViewAnimationCurveEaseIn – The animation sequence starts out slow and speeds up as the end approaches.
  • UIViewAnimationCurveEaseInOut – The animation starts slow, speeds up and then slows down again.
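To build intuition for these curve shapes, here is a sketch in plain Ruby using simple cubic easing functions. These are my own approximations for illustration only; they are not Apple's actual timing curves:

```ruby
# Approximate timing curves mapping normalized time t (0..1) to animation
# progress (0..1). Simple cubics chosen to illustrate the four shapes;
# Apple's real curves differ.
def linear(t)
  t
end

def ease_in(t)
  t**3 # starts slow, speeds up towards the end
end

def ease_out(t)
  1 - (1 - t)**3 # starts fast, slows towards the end
end

def ease_in_out(t)
  t < 0.5 ? 4 * t**3 : 1 - 4 * (1 - t)**3 # slow, fast, then slow again
end
```

Plotting any of these over t = 0..1 shows why, for example, ease-in-out feels the most natural for view movement: the view accelerates away from its start point and decelerates into its destination.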

Receiving Notification of Animation Completion

Once an animation sequence has been committed and is underway it may be necessary to receive notification when the animation is completed so that the application code can, for example, trigger another animation sequence. The UIView setAnimationDidStopSelector class method allows a method to be specified that will be called when the animation sequence is completed. For example, the following code fragment declares that the method named animationFinished is to be called at the end of the animation sequence:
[UIView setAnimationDidStopSelector:
    @selector(animationFinished:finished:context:)];
The animationFinished method would subsequently be declared as follows:
-(void)animationFinished:(NSString *)animationID
finished:(NSNumber *)finished
context:(void *)context
{
 // Code to be executed on completion of animation sequence
}

Performing Affine Transformations

Transformations allow changes to be made to the coordinate system of a screen area. This essentially allows the programmer to rotate, resize and translate a UIView object. A call is made to one of a number of transformation functions and the result assigned to the transform property of the UIView object.
For example, to change the scale of a UIView object named myView by a factor of 2 in both height and width:
myView.transform = CGAffineTransformMakeScale(2, 2);
Similarly, the UIView object may be rotated using the CGAffineTransformMakeRotation function, which takes as an argument the angle (in radians) by which the view is to be rotated. The following code, for example, rotates a view by 90 degrees:
myView.transform = CGAffineTransformMakeRotation( 90 * M_PI  / 180);
The key point to keep in mind with transformations is that they become animated effects when performed within an animation block. The transformations evolve over the duration of the animation and follow the specified animation curve in terms of timing.

Combining Transformations

Two transformations may be combined to create a single transformation effect via a call to the CGAffineTransformConcat() function. This function takes as arguments the two transformation objects that are to be combined. The result may then be assigned to the transform property of the UIView object to be transformed. The following code fragment, for example, both scales and rotates a UIView object named myView:
CGAffineTransform scaleTrans = 
      CGAffineTransformMakeScale(2, 2);
CGAffineTransform rotateTrans = 
      CGAffineTransformMakeRotation(angle * M_PI / 180);
myView.transform = CGAffineTransformConcat(scaleTrans, rotateTrans);
Affine transformations offer an extremely powerful and flexible mechanism for creating animations and it is just not possible to do justice to these capabilities in a single chapter. In order to learn more about affine transformations, a good starting place is the Transforms chapter of Apple’s Quartz 2D Programming Guide.

Creating the Animation Example Application

The remainder of this chapter is dedicated to the creation of an iPad application intended to demonstrate the use of Core Animation. The end result is a simple application on which a blue square appears. When the user touches a location on the screen the box moves to that location. Through the use of affine transformations, the box will rotate 180 degrees as it moves to the new location whilst also changing in size.
Begin by launching Xcode and creating a new Single View Application with both product and class prefix named animate.

Implementing the Interface File

For the purposes of this application we will need a UIView to represent the blue square and variables to contain the rotation angle and scale factor by which the square will be transformed. These need to be declared in the animateViewController.h file as follows:
#import <UIKit/UIKit.h>

@interface animateViewController : UIViewController

@property (nonatomic) float   scaleFactor;
@property (nonatomic) float   angle;
@property (strong, nonatomic) UIView *boxView;
@end

Drawing in the UIView

Having declared the UIView reference we now need to initialize an instance object and draw a blue square located at a specific location on the screen. We also need to initialize our scaleFactor and angle variables and add boxView as a subview of the application’s main view object. These tasks only need to be performed once when the application first starts up so a good option is to override the loadView method in the animateViewController.m file. Note that in addition to adding this method we also need to synthesize access to the previously declared properties:
#import "animateViewController.h"

@interface animateViewController ()

@end

@implementation animateViewController
@synthesize boxView, scaleFactor, angle;
.
.
// Implement loadView to create a view hierarchy programmatically, without using a nib.
- (void)loadView {
 [super loadView];
        scaleFactor = 2;
        angle = 180;
        CGRect frameRect = CGRectMake(10, 10, 100, 100);
        boxView = [[UIView alloc] initWithFrame:frameRect];
        boxView.backgroundColor = [UIColor blueColor];
        [self.view addSubview:boxView];
}
.
.
@end

Detecting Screen Touches and Performing the Animation

When the user touches the screen the blue box needs to move from its current location to the location of the touch. During this motion, the box will rotate 180 degrees and change in size. The detection of screen touches was covered in detail in An Overview of iOS 5 iPad Multitouch, Taps and Gestures. For the purposes of this example we want to initiate the animation at the point that the user’s finger is lifted from the screen so we need to implement the touchesEnded method in the animateViewController.m file:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
   UITouch *touch = [touches anyObject];
   CGPoint location = [touch locationInView:self.view];
   [UIView beginAnimations:nil context:nil];
   [UIView setAnimationDelegate:self];
   [UIView setAnimationDuration:2];
   [UIView setAnimationCurve:UIViewAnimationCurveEaseInOut];
   CGAffineTransform scaleTrans =
     CGAffineTransformMakeScale(scaleFactor, scaleFactor);
   CGAffineTransform rotateTrans = 
     CGAffineTransformMakeRotation(angle * M_PI / 180);
   boxView.transform = CGAffineTransformConcat(scaleTrans, rotateTrans);
   angle = (angle == 180 ? 360 : 180);
   scaleFactor = (scaleFactor == 2 ? 1 : 2);
   boxView.center = location;
   [UIView commitAnimations];
}
Before compiling and running the application we need to take some time to describe the actions performed in the above method. First, the method gets the UITouch object from the touches argument and the locationInView method of this object is called to identify the location on the screen where the touch took place:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
The animation block is then started and the current class declared as the delegate. The duration of the animation is set to 2 seconds and curve set to ease in/ease out:
[UIView beginAnimations:nil context:nil];
[UIView setAnimationDelegate:self];
[UIView setAnimationDuration:2];
[UIView setAnimationCurve:UIViewAnimationCurveEaseInOut];
Two transformations are then generated for the view, one to scale the size of the view and one to rotate it 180 degrees. These transformations are then combined into a single transformation and applied to the UIView object:
CGAffineTransform scaleTrans =
  CGAffineTransformMakeScale(scaleFactor, scaleFactor);
CGAffineTransform rotateTrans = 
  CGAffineTransformMakeRotation(angle * M_PI / 180);
boxView.transform = CGAffineTransformConcat(scaleTrans, rotateTrans);
Ternary operators are then used to switch the scale and rotation angle variables ready for the next touch. In other words, after rotating 180 degrees on the first touch the view will need to be rotated to 360 degrees on the next animation. Similarly, once the box has been scaled by a factor of 2 it needs to scale back to its original size on the next animation:
angle = (angle == 180 ? 360 : 180);
scaleFactor = (scaleFactor == 2 ? 1 : 2);
Finally, the location of the view is moved to the point on the screen where the touch occurred before the animation is committed:
boxView.center = location;
[UIView commitAnimations];
Once the touchesEnded method has been implemented it is time to try out the application.

Building and Running the Animation Application

Once all the code changes have been made and saved, click on the Run button in the Xcode toolbar. Once the application has compiled it will load into the iOS Simulator (refer to Testing iOS 5 Apps on the iPad – Developer Certificates and Provisioning Profiles for steps on how to run the application on an iPad device).
When the application loads the blue square should appear near the top left hand corner of the screen. Click (or touch if running on a device) the screen and watch the box glide and rotate to the new location, the size of the box changing as it moves:

An example iPad iOS 5 Core Animation app running
Figure 49-1

Summary

Core Animation provides an easy to implement interface to animation within iOS 5 iPad applications. From the simplest of tasks such as gracefully fading out a user interface element to basic animation and transformations, Core Animation provides a variety of techniques for enhancing user interfaces. This chapter covered the basics of Core Animation before working step-by-step through an example to demonstrate the implementation of motion, rotation and scaling animation.

Writing Tests for RubyMotion Apps

1. Getting Started

RubyMotion integrates Bacon, a small clone of the popular RSpec framework written by Christian Neukirchen.
More specifically, RubyMotion uses a version of Bacon called MacBacon which has been extended to integrate well with iOS. MacBacon is maintained by Eloy Duran.

1.1. Spec Files

Spec files contain the tests of your project.
Spec files are created under the spec directory of a RubyMotion project.
By default, a RubyMotion project has a spec/main_spec.rb file which contains a single test that ensures that the application has a window.

1.2. Spec Helpers

Spec helpers can be used to extend the testing framework, for instance by introducing a common set of classes or methods that will be used by the spec files. Spec helpers will be compiled and executed before the actual spec files.
Spec helpers are created under the spec/helpers directory of a RubyMotion project. An example could be spec/helpers/extension.rb.
By default, a RubyMotion project has no spec helper.

1.3. Running the Tests

The spec Rake task can be used to run the test suite of a RubyMotion project.
$ rake spec
This command compiles a special version of your app that includes the spec framework, helpers and spec files, and executes it in the simulator in the background.
Once the specs are performed, the program yields back to the command-line prompt with a proper status code (0 in case of success, 1 otherwise).

1.4. Run Selected Spec Files

Sometimes you may not want to run the entire test suite but only one or more isolated spec files.
The files environment variable can be set to a series of comma-delimited patterns in order to filter the spec files that should be executed. Patterns can be either the basename of a spec file (without the file extension) or its path.
As an example, the following command will only run the spec/foo_spec.rb and spec/bar_spec.rb files.
$ rake spec files=foo_spec,spec/bar_spec.rb
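The matching rule can be sketched in plain Ruby. This is an illustration of the behaviour described above, not RubyMotion's actual implementation; select_spec_files is a hypothetical name:

```ruby
# Sketch of the files= filtering rule: a pattern matches a spec file if it
# equals the file's path or its extension-less basename.
# Illustration only; not RubyMotion's actual implementation.
def select_spec_files(all_files, patterns_string)
  patterns = patterns_string.split(',')
  all_files.select do |path|
    basename = File.basename(path, '.rb')
    patterns.any? { |pattern| pattern == path || pattern == basename }
  end
end
```

Given spec/foo_spec.rb, spec/bar_spec.rb and spec/baz_spec.rb, the pattern string "foo_spec,spec/bar_spec.rb" selects the first two files and skips the third.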

1.5. Output Format

It is possible to customize the output format of rake spec by specifying a value for the output environment variable. Possible output formats are: spec_dox (default), fast, test_unit, tap and knock.
$ rake spec output=test_unit

2. Basic Testing

You can refer to MacBacon’s README file for a list of assertions and core predicates that the framework supports.

3. Views and Controllers Testing

This layer lets you write functional tests for your controllers and interact with their views through a set of high-level event-generating APIs, leveraging the functionality of Apple’s UIAutomation framework without forcing you to write the tests in JavaScript.
This consists of a small API available to your specifications, some runloop helpers, and a couple of UIView extensions.
Important This is not meant for full application acceptance tests, so you should not let the application launch as normal. This can, for instance, be done by using RUBYMOTION_ENV to return early from application:didFinishLaunchingWithOptions::
class AppDelegate
  def application(application, didFinishLaunchingWithOptions:launchOptions)
    return true if RUBYMOTION_ENV == 'test'
    # ... normal launch code ...
    true
  end
end

3.1. Configuring your Context

You need to tell the specification context which controller will be specified and extend it with the required API. You do this by specifying your view controller class in the following way:
describe "The 'taking over the world' view" do
  tests TakingOverTheWorldViewController

  # Add your specifications here.
end
This will, before each specification, instantiate a new window and a new instance of your view controller class. These are available in your specifications as window and controller.
Tip If you need to perform custom instantiation of either the window or controller then you can do so from a before filter before calling tests.

3.2. Durations

Some methods take a :duration option which specifies the period of time, in seconds, during which events will be generated. This is always optional.
Tip The default duration value can be changed through Bacon::Functional.default_duration=.

3.3. Device Events

These methods generate events that operate on the device level. As such, they don’t take an accessibility label or specific view.

3.3.1. rotate_device

Rotates the device to the specified orientation.
rotate_device(:to => orientation, :button => location)
  • to: The orientation to rotate the device to, which can be either :portrait or :landscape.
  • button: Used to indicate a specific portrait/landscape orientation which can be either :bottom or :top in portrait mode, or either :left or :right in landscape mode. If omitted, it will default to :bottom in portrait mode and :left in landscape mode.
The following example rotates the device to the landscape orientation with the home button on the left-hand side of the device:
rotate_device :to => :landscape
Or to have the button on the right-hand side of the device:
rotate_device :to => :landscape, :button => :right

3.3.2. accelerate

Generates accelerometer events.
accelerate(:x => x_axis_acceleration, :y => y_axis_acceleration,
           :z => z_axis_acceleration, :duration => duration)
  • x: With the device held in portrait orientation and the screen facing you, the x axis runs from left (negative values) to right (positive values) across the face of the device.
  • y: With the device held in portrait orientation and the screen facing you, the y axis runs from bottom (negative values) to top (positive values) across the face of the device.
  • z: With the device held in portrait orientation and the screen facing you, the z axis runs from back (negative values) to front (positive values) through the device.
This will simulate a device laying still on its back:
accelerate :x => 0, :y => 0, :z => -1

3.3.3. shake

Essentially generates accelerometer events.
shake()
Use this when you want to specifically trigger a shake motion event.
For more information see the event handling guide.

3.4. Finding Views

These methods allow you to retrieve views. They traverse down through the view hierarchy, starting from the current window.
If no view matches, they will keep retrying until the timeout, which defaults to three seconds, expires. This means you don’t need to worry about whether or not the view you’re looking for is still being loaded or animated.
Finally, if the timeout passes and no view matches, an exception will be raised.
Tip The default timeout value can be changed through Bacon::Functional.default_timeout=.
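The retry-until-timeout behaviour can be sketched generically in plain Ruby. poll_until is a hypothetical helper written for illustration, not MacBacon's real code:

```ruby
# Poll a block until it returns a truthy value or the timeout expires.
# Hypothetical sketch of the retry-until-timeout idea described above.
def poll_until(timeout: 3, interval: 0.1)
  deadline = Time.now + timeout
  loop do
    result = yield
    return result if result
    raise "timed out after #{timeout}s" if Time.now >= deadline
    sleep interval
  end
end
```

A view-lookup helper then behaves like poll_until(timeout: 3) { find_matching_view }: it returns as soon as the view appears and only raises after the deadline has passed.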

3.4.1. view

Returns the view that matches the specified accessibility label.
view(label)
Example:
button = UIButton.buttonWithType(UIButtonTypeRoundedRect)
button.setTitle('Take over the world', forState:UIControlStateNormal)
window.addSubview(button)

view('Take over the world') # => button
Tip See UIView#viewByName(accessibilityLabel, timeout).

3.4.2. views

Returns an array of all the views that match the given class.
views(view_class)
Example:
button1 = UIButton.buttonWithType(UIButtonTypeRoundedRect)
button1.setTitle('Take over the world', forState:UIControlStateNormal)
window.addSubview(button1)

button2 = UIButton.buttonWithType(UIButtonTypeRoundedRect)
button2.setTitle('But not tonight', forState:UIControlStateNormal)
window.addSubview(button2)

views(UIButton) # => [button1, button2]
Tip See UIView#viewsByClass(viewClass, timeout).

3.5. View Events

These methods all operate on views. You specify the view to operate on by its ‘accessibility label’ or pass in a view instance.
Note In general all the UIKit controls will have decent default values for their accessibility labels. E.g. a UIButton with title “Take over the world” will have the same value for its accessibility label. If, however, you have custom views, or otherwise need to override the default, then you can do so by setting its accessibilityLabel attribute.
Wherever a ‘location’ is required you can either specify a CGPoint instance or use one of the following constants:
  • :top_left
  • :top
  • :top_right
  • :right
  • :bottom_right
  • :bottom
  • :bottom_left
  • :left
Note CGPoint instances have to be specified in window coordinates.
Tip Some of the methods take a :from location and a :to location option. If only :from or :to is specified and with a location constant, then the other option can be omitted and will default to the opposite of the specified location. If, however, a CGPoint instance is used, then the other option has to be specified as well.
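The defaulting rule above can be illustrated with a small plain-Ruby sketch. The `OPPOSITES` table and `resolve_locations` helper are hypothetical names for illustration, not part of the framework.

```ruby
# Each location constant maps to its opposite across the view's center.
OPPOSITES = {
  :top_left     => :bottom_right,
  :top          => :bottom,
  :top_right    => :bottom_left,
  :right        => :left,
  :bottom_right => :top_left,
  :bottom       => :top,
  :bottom_left  => :top_right,
  :left         => :right
}

# Resolve :from/:to, defaulting a missing option to the opposite of the
# one that was given. A CGPoint has no natural opposite, which is why
# both options must be specified when points are used.
def resolve_locations(from: nil, to: nil)
  raise ArgumentError, 'specify :from and/or :to' if from.nil? && to.nil?
  from ||= OPPOSITES.fetch(to)
  to   ||= OPPOSITES.fetch(from)
  [from, to]
end
```

For example, `resolve_locations(from: :left)` yields `[:left, :right]`, matching how `flick 'Enable rainbow theme', :to => :right` implies a start at `:left`.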

3.5.1. tap

Generates events that simulate tapping a view.
tap(label_or_view, :at => location, :times => number_of_taps, :touches => number_of_fingers)
All of these options are optional:
  • at: The location where the tap will occur. Defaults to the center of the view.
  • times: The number of times to tap the view. Defaults to a single tap.
  • touches: The number of fingers used to tap the view. Defaults to a single touch.
Tapping a view once only requires:
button = UIButton.buttonWithType(UIButtonTypeRoundedRect)
button.setTitle('Take over the world', forState:UIControlStateNormal)
window.addSubview(button)

tap 'Take over the world'
Tapping a view twice with two fingers requires you to specify those options:
view = UIView.alloc.initWithFrame(CGRectMake(0, 0, 100, 100))
view.accessibilityLabel = 'tappable view'
recognizer = UITapGestureRecognizer.alloc.initWithTarget(self, action:'handleTap:')
recognizer.numberOfTapsRequired = 2
recognizer.numberOfTouchesRequired = 2
view.addGestureRecognizer(recognizer)

tap 'tappable view', :times => 2, :touches => 2

3.5.2. flick

Generates a short fast drag gesture.
flick(label_or_view, :from => location, :to => location, :duration => duration)
  • from: The location where the drag will start.
  • to: The location where the drag will end.
Flicking a switch would be done like so:
switch = UISwitch.alloc.initWithFrame(CGRectMake(0, 0, 100, 100))
switch.accessibilityLabel = 'Enable rainbow theme'
window.addSubview(switch)

flick 'Enable rainbow theme', :to => :right

3.5.3. pinch_open

Generates an opening pinch gesture.
pinch_open(label_or_view, :from => location, :to => location, :duration => duration)
  • from: The location where both fingers are at the start of the gesture. Defaults to :left.
  • to: The location where the moving finger will be at the end of the gesture. Defaults to :right.
The following zooms in on the content view of a UIScrollView:
view('Zooming scrollview').zoomScale # => 1.0
pinch_open 'Zooming scrollview'
view('Zooming scrollview').zoomScale # => 2.0

3.5.4. pinch_close

Generates a closing pinch gesture.
pinch_close(label_or_view, :from => location, :to => location, :duration => duration)
  • from: The location where the moving finger will be at the start of the gesture. Defaults to :right.
  • to: The location where both fingers are at the end of the gesture. Defaults to :left.
The following zooms out of the content view of a UIScrollView:
view('Zooming scrollview').zoomScale # => 1.0
pinch_close 'Zooming scrollview'
view('Zooming scrollview').zoomScale # => 0.5

3.5.5. drag

Generates a drag gesture (i.e. panning, scrolling) over a path interpolated between the start and end location.
drag(label_or_view, :from => location, :to => location, :number_of_points => steps,
     :points => path, :touches => number_of_fingers, :duration => duration)
  • from: The location where the drag will start. Not used if :points is specified.
  • to: The location where the drag will end. Not used if :points is specified.
  • number_of_points: The number of points along the path that is interpolated between :from and :to. Defaults to 20. Not used if :points is specified.
  • points: An array of CGPoint instances that specify the drag path.
  • touches: The number of fingers used to drag. Defaults to a single touch.
Note Keep in mind that scrolling into a direction means dragging into the opposite direction.
The following will scroll down in a scroll view:
view('Scrollable scrollview').contentOffset.y # => 0
drag 'Scrollable scrollview', :from => :bottom
view('Scrollable scrollview').contentOffset.y # => 400
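The path interpolation that `:number_of_points` controls can be sketched in plain Ruby. This is an assumed illustration, not the library’s implementation; `[x, y]` pairs stand in for CGPoint instances.

```ruby
# Linearly interpolate number_of_points points between a start and end
# location. t runs from 0.0 (at :from) to 1.0 (at :to).
def interpolate_path(from, to, number_of_points = 20)
  (0...number_of_points).map do |i|
    t = i.to_f / (number_of_points - 1)
    [from[0] + (to[0] - from[0]) * t,
     from[1] + (to[1] - from[1]) * t]
  end
end

# Dragging from the bottom upwards (which scrolls the content down):
path = interpolate_path([0, 400], [0, 0], 5)
# path => [[0.0, 400.0], [0.0, 300.0], [0.0, 200.0], [0.0, 100.0], [0.0, 0.0]]
```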

3.5.6. rotate

Generates a clockwise rotation gesture around the center point of the view.
rotate(label_or_view, :radians => angle, :degrees => angle, :touches => number_of_fingers,
       :duration => duration)
  • radians: The angle of the rotation in radians. Defaults to π.
  • degrees: The angle of the rotation in degrees. Defaults to 180.
  • touches: The number of fingers used to rotate. Defaults to 2.
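The `:radians` and `:degrees` options express the same angle, so the defaults of π and 180 are equivalent. A small plain-Ruby sketch of the conversion (the helper names are illustrative, not framework code):

```ruby
# Convert between the two ways of expressing the rotation angle.
def degrees_to_radians(degrees)
  degrees * Math::PI / 180.0
end

def radians_to_degrees(radians)
  radians * 180.0 / Math::PI
end

degrees_to_radians(180) # => Math::PI, the default rotation angle
```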

What is a Gyroscope?

See how a simple gyroscope begins to behave differently when the inner wheel is spun. This demonstrates the unique consequences of angular momentum, which allows the gyroscope to resist force when you try to tilt it and also enables it to stand on its end without falling over while the inner wheel keeps spinning. Gyroscopes play important roles in aircraft, ships, spacecraft, and robots, as well as in other machines and devices.


A gyroscope


A gyroscope in operation with freedom in all three axes. The rotor will maintain its spin axis direction regardless of the orientation of the outer frame.