Tuesday 29 December 2015

Sending & Receiving Push Notifications in Ionic Apps

https://thinkster.io/ionic-push-notifications-tutorial

Use The Android And iOS Camera With Ionic Framework

Most smartphones on the market have at least one camera, if not two.  You may want to leverage these cameras to make the next Instagram or Snapchat type application.  Lucky for us, the native Android and iOS camera can be easily accessed using Ionic Framework and the AngularJS extension set, ngCordova.
The following will help you add camera functionality into your latest creation.

Let’s start by creating a new Ionic project with the Android and iOS platforms:
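With the Ionic CLI of the time, the commands look something like this (the app name myApp and the blank starter template are arbitrary choices):

```shell
ionic start myApp blank
cd myApp
ionic platform add ios
ionic platform add android
```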

Remember that you must be using a Mac if you wish to build for iOS.
The next thing we want to do is add the Apache Cordova camera plugin.  This can be done by running the following:
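The command looks like this (cordova-plugin-camera is the plugin's current npm id; older tutorials used the id org.apache.cordova.camera):

```shell
cordova plugin add cordova-plugin-camera
```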
For this tutorial we are going to be using the AngularJS extension set for Apache Cordova called ngCordova.  Start by downloading the latest release of ngCordova and placing the ng-cordova.min.js file in your project’s www/js directory.
Next we need to include this file in our project’s code.  Open the index.html file and include the script before the cordova.js line like below:
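The include ends up looking something like this (the path assumes the www/js location used above):

```html
<script src="js/ng-cordova.min.js"></script>
<script src="cordova.js"></script>
```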
This will include the library into our project, but now we need to include it for use in AngularJS.  Open your app.js file and alter the angular.module line to look like the following:
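Assuming the blank template's default module name of starter, the line becomes:

```javascript
angular.module('starter', ['ionic', 'ngCordova'])
```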
Now we’re ready for the fun stuff.  Inside your app.js file, we need to add a controller with a method that will launch the camera.  The following was copied almost exactly from the ngCordova documentation:
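The controller looks something like this, adapted from the ngCordova camera example (the module name starter and controller name ExampleController are my own choices):

```javascript
angular.module('starter')
    .controller('ExampleController', function($scope, $cordovaCamera) {
        $scope.takePicture = function() {
            var options = {
                quality: 75,
                destinationType: Camera.DestinationType.DATA_URL,
                sourceType: Camera.PictureSourceType.CAMERA,
                allowEdit: true,
                encodingType: Camera.EncodingType.JPEG,
                targetWidth: 300,
                targetHeight: 300,
                popoverOptions: CameraPopoverOptions,
                saveToPhotoLibrary: false
            };
            $cordovaCamera.getPicture(options).then(function(imageData) {
                // DATA_URL returns raw base64 data, so prefix it for <img> use
                $scope.imgURI = "data:image/jpeg;base64," + imageData;
            }, function(err) {
                // The camera failed or the user cancelled
            });
        };
    });
```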
Notice that in the success handler I prepend a bit of extra data before storing the result in my scope.  Because our destination type is DATA_URL we get back raw base64 camera data rather than a file.  By adding the data:image/jpeg;base64, prefix we can use an HTML img tag to display our freshly created snapshot.
Just like that, you can call takePicture() from your UI and initialize the native Android and iOS camera in Ionic Framework.

Implement Google Maps Using Ionic Framework

The following will get you a fresh Ionic project with geolocation and a fancy Google Map.

Like with most of my Ionic tutorials, let’s start by creating a new project with the iOS and Android platforms added: 
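As before, something like the following does the job (the app name mapApp is arbitrary):

```shell
ionic start mapApp blank
cd mapApp
ionic platform add ios
ionic platform add android
```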

Remember, if you’re not on a Mac, you cannot build for iOS.
Since we are using maps, it is probably a good idea to add geolocation functionality to your application.  You can add the Apache Cordova Geolocation API by running the following command:
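The command looks like this (cordova-plugin-geolocation is the plugin's current npm id; older tutorials used org.apache.cordova.geolocation):

```shell
cordova plugin add cordova-plugin-geolocation
```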
We are going to be using the Google Maps JavaScript SDK, which requires us to have an API key for use in our application.  Go into your Google API Console and register a new Google Maps application.
When you have your key, crack open your www/index.html file because we need to add the JavaScript library to our project.
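A script tag along these lines does the job (YOUR_API_KEY is a placeholder for the key from your console):

```html
<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY"></script>
```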
Please note that the Google Maps JavaScript library cannot be downloaded for offline use; it is fetched from Google's servers at runtime, so there will always be a small delay during the initial setup when you launch your application.
Now that the SDK is included we need to add some custom CSS for displaying it on the screen.  Open your www/css/style.css file and add the following code:
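A minimal version, assuming the map container uses an id of map:

```css
#map {
    width: 100%;
    height: 100%;
}
```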
Next we get to start doing the fun stuff.  Open your www/app.js file because we want to add a controller that handles the Google Map.
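A sketch of such a controller (the controller name and starter module name are my own choices; the Merced coordinates are approximate):

```javascript
angular.module('starter')
    .controller('MapController', function($scope) {
        // Start centered on Merced, California
        var merced = new google.maps.LatLng(37.3022, -120.4830);
        var map = new google.maps.Map(document.getElementById('map'), {
            center: merced,
            zoom: 15,
            mapTypeId: google.maps.MapTypeId.ROADMAP
        });
        // Try to re-center on the device's current position and drop a marker
        navigator.geolocation.getCurrentPosition(function(pos) {
            var here = new google.maps.LatLng(pos.coords.latitude, pos.coords.longitude);
            map.setCenter(here);
            new google.maps.Marker({ position: here, map: map });
        }, function(err) {
            // Geolocation failed; leave the map centered on Merced
        });
        $scope.map = map;
    });
```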
The above code will register the map to a DOM element with a map id.  It will center the map in Merced, California and then attempt to center the map around your current location.  If it finds your current location it will place a marker.
We are not quite done yet.  We need to add some code to our www/index.html file for containing the map.
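Inside the body, the container can be as simple as the following (data-tap-disabled keeps Ionic's tap handling from interfering with map gestures):

```html
<ion-content scroll="false">
    <div id="map" data-tap-disabled="true"></div>
</ion-content>
```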
Now, the depth of Google Maps can go far beyond the simplistic nature I demonstrated in my article.  Have some fun with the official API docs and use Google Maps to its full power.

Monday 7 December 2015

What are the performance characteristics of sqlite with very large database files?

So I did some tests with sqlite for very large files, and came to some conclusions (at least for my specific application).
The tests involve a single sqlite file with either a single table, or multiple tables. Each table had about 8 columns, almost all integers, and 4 indices.
The idea was to insert enough data until sqlite files were about 50GB.
Single Table
I tried to insert multiple rows into a sqlite file with just one table. When the file was about 7GB (sorry I can't be specific about row counts) insertions were taking far too long. I had estimated that my test to insert all my data would take 24 hours or so, but it did not complete even after 48 hours.
This leads me to conclude that a single, very large sqlite table will have issues with insertions, and probably other operations as well.
I guess this is no surprise, as the table gets larger, inserting and updating all the indices take longer.
Multiple Tables
I then tried splitting the data by time over several tables, one table per day. The data from the original single table was split across ~700 tables.
This setup had no problems with insertion; it did not take longer as time progressed, since a new table was created for each day.
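A sketch of that per-day layout, using Python's built-in sqlite3 module just to keep the example self-contained (the schema and naming scheme are invented for the demo; real code should validate generated table names rather than interpolating them blindly):

```python
import sqlite3
import datetime

conn = sqlite3.connect(":memory:")

def table_for(day):
    # One table per day, e.g. events_20151207; a fresh table keeps each
    # day's indices small, so inserts stay fast as the data set grows.
    name = "events_" + day.strftime("%Y%m%d")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS %s (id INTEGER PRIMARY KEY, value INTEGER)" % name)
    conn.execute("CREATE INDEX IF NOT EXISTS idx_%s ON %s (value)" % (name, name))
    return name

def insert(day, value):
    # Route each row to its day's table
    conn.execute("INSERT INTO %s (value) VALUES (?)" % table_for(day), (value,))

insert(datetime.date(2015, 12, 7), 42)
insert(datetime.date(2015, 12, 8), 7)
```

Dropping a whole day is then just a DROP TABLE (or, with one file per day, deleting the file).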
Vacuum Issues
As pointed out by i_like_caffeine, the VACUUM command becomes more of a problem the larger the sqlite file is. As more inserts/deletes are done, the fragmentation of the file on disk gets worse, so the goal is to periodically VACUUM to optimize the file and recover file space.
However, as the documentation points out, a full copy of the database is made to do a vacuum, which takes a very long time to complete. So, the smaller the database, the faster this operation will finish.
Conclusions
For my specific application, I'll probably be splitting out data over several db files, one per day, to get the best of both vacuum performance and insertion/delete speed.
This complicates queries, but for me, it's a worthwhile tradeoff to be able to index this much data. An additional advantage is that I can just delete a whole db file to drop a day's worth of data (a common operation for my application).
I'd probably have to monitor table size per file as well to see when the speed will become a problem.
It's too bad that there doesn't seem to be an incremental vacuum method other than auto vacuum. I can't use it because my goal for vacuuming is to defragment the file (file space isn't a big deal), which auto vacuum does not do. In fact, the documentation states it may make fragmentation worse, so I have to resort to periodically doing a full vacuum on the file.


 http://stackoverflow.com/questions/784173/what-are-the-performance-characteristics-of-sqlite-with-very-large-database-file

Improve INSERT-per-second performance of SQLite?

Several tips:
  1. Put inserts/updates in a transaction.
  2. For older versions of SQLite, consider a less paranoid journal mode (PRAGMA journal_mode). There is MEMORY, and then there is OFF, which can significantly increase insert speed if you're not too worried about the database possibly getting corrupted if the OS crashes. If only your application crashes, the data should be fine. Note that in newer versions, the OFF/MEMORY settings are not safe for application-level crashes.
  3. Playing with page sizes makes a difference as well (PRAGMA page_size). Having larger page sizes can make reads and writes go a bit faster as larger pages are held in memory. Note that more memory will be used for your database.
  4. If you have indices, consider calling CREATE INDEX after doing all your inserts. This is significantly faster than creating the index and then doing your inserts.
  5. You have to be quite careful if you have concurrent access to SQLite, as the whole database is locked when writes are done, and although multiple readers are possible, writes will be locked out. This has been improved somewhat with the addition of a WAL in newer SQLite versions.
  6. Take advantage of saving space...smaller databases go faster. For instance, if you have key value pairs, try making the key an INTEGER PRIMARY KEY if possible, which will replace the implied unique row number column in the table.
  7. If you are using multiple threads, you can try using the shared page cache, which will allow loaded pages to be shared between threads, which can avoid expensive I/O calls.
I've also asked similar questions here and here.
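To illustrate tips 1 and 4 — wrapping the bulk load in one transaction and creating the index afterwards — here is a small sketch using Python's built-in sqlite3 module (the events table and its columns are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, ts INTEGER, value INTEGER)")

rows = [(i, i % 100) for i in range(10000)]

# Tip 1: one transaction around the whole batch, instead of one
# implicit transaction (and one journal sync) per INSERT.
with conn:  # commits on success, rolls back on error
    cur.executemany("INSERT INTO events (ts, value) VALUES (?, ?)", rows)

# Tip 4: build the index after the bulk load, not before it.
cur.execute("CREATE INDEX idx_events_ts ON events (ts)")

cur.execute("SELECT COUNT(*) FROM events")
print(cur.fetchone()[0])  # 10000
```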

Swift performance: sorting arrays

Swift is now as fast as C by this benchmark using the default release optimisation level [-O].
Here is an in-place quicksort in Swift:
func quicksort_swift(_ a: inout [CInt], start: Int, end: Int) {
    if end - start < 2 {
        return
    }
    // Hoare-style partition around a middle pivot
    let p = a[start + (end - start) / 2]
    var l = start
    var r = end - 1
    while l <= r {
        if a[l] < p {
            l += 1
            continue
        }
        if a[r] > p {
            r -= 1
            continue
        }
        let t = a[l]
        a[l] = a[r]
        a[r] = t
        l += 1
        r -= 1
    }
    quicksort_swift(&a, start: start, end: r + 1)
    quicksort_swift(&a, start: r + 1, end: end)
}
 
 
 http://stackoverflow.com/questions/24101718/swift-performance-sorting-arrays?rq=1

How can I cast an NSMutableArray to a Swift array of a specific type?

You can make this work with a double downcast, first to NSArray, then to [String]:
var strings = myObjcObject.getStrings() as NSArray as [String]
Tested in a Playground with:
import Foundation

var objCMutableArray = NSMutableArray(array: ["a", "b", "c"])
var swiftArray = objCMutableArray as NSArray as [String]
Update:
In later versions of Swift (at least 1.2), the compiler will complain about as [String]. Instead you should use an if let with a conditional downcast as?:
import Foundation

var objCMutableArray = NSMutableArray(array: ["a", "b", "c"])
if let swiftArray = objCMutableArray as NSArray as? [String] {
    // Use swiftArray here
}
If you are absolutely sure that your NSMutableArray can be cast to [String], then you can use as! instead (but you probably shouldn't use this in most cases):
import Foundation

var objCMutableArray = NSMutableArray(array: ["a", "b", "c"])
var swiftArray = objCMutableArray as NSArray as! [String]