
ios8-day-by-day's Introduction

iOS8-day-by-day

Apple delivered iOS 8 to the developer world at WWDC in June 2014, before launching it to the wider world in September of the same year. The iOS 8 SDK was somewhat overshadowed by the simultaneous announcement of a new programming language in the form of Swift, but that doesn't mean the core OS had been overlooked. Quite the opposite: a huge number of new APIs were introduced, powering tons of new functionality.

iOS8: Day-by-Day is a review of the most important of these. Busy developers don't have time to trawl the WWDC videos and Apple documentation. Instead they'd like to get a high-level summary of the new possibilities, alongside some working sample code. This is exactly what iOS8: Day-by-Day provides. It started out as a blog series, and these blog posts now form the basis of the book.

This repo contains all the source code associated with each of the articles, allowing you to see each of the new technologies in action.

Book

iOS8 Day-by-Day is now a published eBook. You can get hold of your copy from Leanpub at leanpub.com/ios8daybyday.

Blog Series

iOS8 Day-by-Day is a blog series which dives into all the new features available to developers in iOS 8. You can read more about it and find an index of the currently published posts at shinobicontrols.com/iOS8DayByDay.

This repo is a collection of projects which accompany the series. Every blog post has a project which demonstrates the functionality in action, rather than just being a purely theoretical overview of the technology.

If you have any suggestions, ideas or comments, please feel free to catch me on Twitter at @iwantmyrealname, and do fork the repo and have a play around with the code!

sam

iwantmyreal.name

@iwantmyrealname @shinobicontrols

ios8-day-by-day's People

Contributors

aaroncrespo, kerautret, monishsyed, samburnstone, sammyd, tadija, zoonooz


ios8-day-by-day's Issues

29 - Safari Action

Is there a way, instead of bringing up the menu to click on the elements, to open a Safari web view and have the JS communicate with that window? I have a little bookmarklet that works wonders with your code, but it overrides the parent window. I'd like either to launch the full iOS app to process the URL, or to run the JS in that popover dialog in Safari.

Is this possible?

10-playgrounds OSX

You need to update the NSView initializer to propagate its superclass's failable initializer (e.g. add ? to init).
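
For reference, a minimal sketch of that change, assuming the failing initializer is the NSCoder one (the subclass name here is made up):

    import Cocoa

    class PulsingView: NSView {  // hypothetical subclass name
        // Once the superclass initializer is failable (init?), the override must
        // be failable too, so a nil result from super can propagate to the caller.
        required init?(coder: NSCoder) {
            super.init(coder: coder)
        }

        override init(frame frameRect: NSRect) {
            super.init(frame: frameRect)
        }
    }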

Pass in an image for the color of the chroma key filter

I'm trying to use your chroma key filter, and right now I have to make two passes over my image: first getting the average color of a small area of the image, then extracting that color from the returned image, and finally passing it to your chroma filter and running it.

It would be great to be able to chain those two filters. Any idea how I could do that?
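
For what it's worth, a rough sketch of the first pass (not the repo's API, and the region is an assumption): CIAreaAverage reduces the region to a single pixel, which still has to be rendered and read back before it can be handed to the chroma filter as a CIColor.

    import CoreImage

    // Rough sketch: average the color of `region`, read the single pixel back,
    // and return it as a CIColor that a chroma-key filter could take as input.
    func averageColor(of image: CIImage, in region: CGRect, context: CIContext) -> CIColor? {
        guard let averaged = CIFilter(name: "CIAreaAverage", withInputParameters: [
            kCIInputImageKey: image,
            kCIInputExtentKey: CIVector(cgRect: region)
        ])?.outputImage else { return nil }

        // CIAreaAverage outputs a 1x1 image; render it into a 4-byte RGBA buffer.
        var pixel = [UInt8](repeating: 0, count: 4)
        context.render(averaged,
                       toBitmap: &pixel,
                       rowBytes: 4,
                       bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                       format: kCIFormatRGBA8,
                       colorSpace: CGColorSpaceCreateDeviceRGB())

        return CIColor(red: CGFloat(pixel[0]) / 255,
                       green: CGFloat(pixel[1]) / 255,
                       blue: CGFloat(pixel[2]) / 255)
    }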

Compile error in Xcode 6.0.1

I pulled the code and opened the 08-today-extension project. I then got the compile error shown in the attached screenshot (target simulator is iPhone 5).

10-playgrounds iOS

Using 6.1 final, the example doesn't work - the animation appears in its initial state and then just pulsates a couple of times.

CloudKit project needs an update for Xcode 6.3

1. /iOS8-day-by-day-master/33-cloudkit/CloudNotes/CloudNotes/NoteManager.swift:74:27: Cannot invoke initializer for type 'CKModifyRecordsOperation' with an argument list of type '(recordsToSave: [CloudKitNote], recordIDsToDelete: NilLiteralConvertible)'

2. /iOS8-day-by-day-master/33-cloudkit/CloudNotes/CloudNotes/Note.swift:42:7: Type 'CloudKitNote' does not conform to protocol 'Note'

please help
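
For the first error, a hedged sketch of one way to satisfy the stricter Swift 1.2 type checker (the function and parameter names here are assumptions, not the repo's code): pass an explicit [CKRecord] array and give the nil deletions argument an explicit type.

    import CloudKit

    // Sketch only: build the operation from plain CKRecords and give the
    // nil deletions argument an explicit type so the initializer resolves.
    func saveOperation(for records: [CKRecord]) -> CKModifyRecordsOperation {
        let recordIDsToDelete: [CKRecordID]? = nil
        return CKModifyRecordsOperation(recordsToSave: records,
                                        recordIDsToDelete: recordIDsToDelete)
    }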

'dispatch_release' is unavailable

iOS8-day-by-day/13-coreimage-detectors/LiveDetection/LiveDetection/CoreImageVideoFilter.swift:53:5: 'dispatch_release' is unavailable

Commenting it out seemed to get it working for me.
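
That matches how dispatch objects behave in Swift: they are memory-managed automatically, so simply removing the call is the intended fix. A minimal sketch, with a made-up queue label:

    import Dispatch

    // In Swift, dispatch objects are released automatically when the last
    // reference goes away, so no dispatch_release call is needed (or available).
    let videoQueue = dispatch_queue_create("com.example.videofilter", DISPATCH_QUEUE_SERIAL)
    dispatch_async(videoQueue) {
        // process a frame...
    }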

Day 5 Auto-sizing table view cells

Hello!
I've noticed that this example doesn't work on iOS 8.3; see the attached screenshot.
I checked the issue in the simulator and on my iPod, with the same result. On iOS 8.1 and 8.2 everything works as expected.
Any ideas how to fix it?
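
For context, a minimal sketch of the self-sizing setup the Day 5 example relies on (assumed here, not copied from the repo): an estimated row height plus UITableViewAutomaticDimension, with the cell content fully constrained by Auto Layout.

    import UIKit

    class AutoSizingTableViewController: UITableViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // A rough estimate used before the real layout pass happens.
            tableView.estimatedRowHeight = 60
            // Ask Auto Layout for the actual height of each cell.
            tableView.rowHeight = UITableViewAutomaticDimension
        }
    }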

Touch ID not working when the app is deployed via iTunes Connect

First, I'd like to thank you for the great example; it was very helpful. I have implemented Touch ID in my app per your example, and it works as expected as long as the app is deployed over a cable (locally).

However, it stops working when the app is deployed via iTunes Connect. In that case it no longer seems to retrieve anything.

Can somebody please provide some help or insight? Thanks in advance.

CoreImageVideoFilter.swift throws error

/iOS8-day-by-day-master/13-coreimage-detectors/LiveDetection/LiveDetection/CoreImageVideoFilter.swift:89:69: Value of type 'OSType' (aka 'UInt32') does not conform to expected dictionary value type 'AnyObject'
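
If the offending line is the pixel-format dictionary handed to an AVCaptureVideoDataOutput (an assumption), boxing the raw OSType in an NSNumber satisfies the object-typed dictionary value:

    import AVFoundation

    let videoOutput = AVCaptureVideoDataOutput()
    // kCVPixelFormatType_32BGRA is an OSType (UInt32); wrapping it in NSNumber
    // satisfies the dictionary's object value requirement.
    videoOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)
    ]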

Sir, how would I pass the result image from CoreImageVideoFilter to the main view?

I am currently implementing the filter in the capture-photo delegate, as below, but when it comes to the implementation the filter is not working.

    @IBAction func capturePhoto(_ sender: Any) {
        // Stop text recognition while the photo is captured
        cameraSession.stopRunning()

        // Start the filter
        videoFilter = CoreImageVideoFilter(superview: view, applyFilterCallback: nil)

        // Simulate a tap on the mode selector to start the process
        if let videoFilter = videoFilter {
            videoFilter.stopFiltering()
            detector = prepareRectangleDetector()
            videoFilter.applyFilter = { image in
                // Run the detection once and reuse the result for both the
                // stored image and the filter output
                let detected = self.performRectangleDetection(image)
                if let detected = detected {
                    self.resultCIImage = detected
                }
                return detected
            }

            videoFilter.startFiltering(currentSession: cameraSession)
        }

        cameraSession.beginConfiguration()
        cameraSession.sessionPreset = AVCaptureSessionPresetPhoto
        let device: AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: device)
            if cameraSession.canAddInput(captureDeviceInput) {
                cameraSession.addInput(captureDeviceInput)
            }
        }
        catch {
            print("Error occurred: \(error)")
            return
        }
        if device.isFocusModeSupported(.continuousAutoFocus) {
            try! device.lockForConfiguration()
            device.focusMode = .continuousAutoFocus
            device.unlockForConfiguration()
        }
        runStillImageCaptureAnimation()
        cameraSession.addOutput(cameraOutput)
        cameraSession.commitConfiguration()

        let photoSettings = AVCapturePhotoSettings()
        photoSettings.flashMode = .on
        photoSettings.isHighResolutionPhotoEnabled = true

        if photoSettings.__availablePreviewPhotoPixelFormatTypes.count > 0 {
            photoSettings.previewPhotoFormat = [kCVPixelBufferPixelFormatTypeKey as String: photoSettings.__availablePreviewPhotoPixelFormatTypes.first!]
        }

        cameraSession.startRunning()
        cameraOutput.isHighResolutionCaptureEnabled = true
        cameraOutput.capturePhoto(with: photoSettings, delegate: self)
    }

    func runStillImageCaptureAnimation() {
        DispatchQueue.main.async {
            // Flash the preview layer to indicate that a photo was taken
            self.preview.layer.opacity = 0.0
            UIView.animate(withDuration: 1.0) {
                self.preview.layer.opacity = 1.0
            }
        }
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print("Error occurred: \(error.localizedDescription)")
        }

        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

            // Build a UIImage from the captured JPEG data
            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

            // Prefer the filtered image if the rectangle detection produced one
            capturedImage = videoFilter?.resultImage == nil ? image : convert(cmage: resultCIImage)
            user?.setScannedlist(list: self.scannerdText)
            user?.capImage(captured: capturedImage!)
            tesseract?.delegate = nil
            tesseract = nil

            self.dismiss(animated: false, completion: { self.cameraSession.stopRunning() })

        } else {
            print("Could not get image data from the sample buffers")
        }
    }

preferredLayoutAttributesFittingAttributes issue

Hello,
I am using preferredLayoutAttributesFittingAttributes to auto-size the cells in my collection view. However, the top content inset of the first cell is not respected, and it starts from the top without any margin. When the data source consists of only one cell, the top content inset is respected.
Do you know of any solutions or workarounds for this?

Thanks
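
For reference, a minimal sketch of the kind of override being described, shown with the Swift 3 method name (assumed, not the poster's code): the cell measures its content view and hands the adjusted attributes back to the layout.

    import UIKit

    class SelfSizingCell: UICollectionViewCell {
        override func preferredLayoutAttributesFitting(_ layoutAttributes: UICollectionViewLayoutAttributes) -> UICollectionViewLayoutAttributes {
            let attributes = super.preferredLayoutAttributesFitting(layoutAttributes)
            // Let Auto Layout work out the smallest height that fits the content
            let size = contentView.systemLayoutSizeFitting(UILayoutFittingCompressedSize)
            attributes.frame.size.height = size.height
            return attributes
        }
    }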
