How should I integrate video and photo functionality in an iOS app? [closed]
I am working on a basic camera app. I have built the video part and the photo part separately, and I now need to integrate them. Should I combine them both into the same class/file, or should I split them into two separate view controllers (VCs)?
How would that work exactly?
ios iphone swift
closed as too broad by Peter Mortensen, Cody Gray♦ Apr 2 at 2:20
Please edit the question to limit it to a specific problem with enough detail to identify an adequate answer. Avoid asking multiple distinct questions at once. See the How to Ask page for help clarifying this question. If this question can be reworded to fit the rules in the help center, please edit the question.
3 Answers
The easiest way of doing this (and the way I did it) is to build the photo-capture functionality and the video-capture functionality in two separate classes first. Once you do, you will find that many of the required protocols and functions overlap. From that point it is quite simple: all you have to do is copy and paste from the video class into the photo class. The photo class has more components, which is why you should start there. For example, setupDevice() and setupInputOutput() should overlap.
So, in conclusion: yes, I recommend integrating everything into one class and setting up two extensions: one for the AVFoundation video protocols and one for the photo protocols.
Quick tips: set up a photoOutput and a movieOutput. The capture session and the previewLayer can serve both, so there should be only one of each. I also built my camera in a Snapchat-esque fashion.
Tell me if you need extra guidance.
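For illustration, here is a minimal sketch of what such a combined class might look like. The class and method names (CameraViewController, setupSession(), and so on) are placeholders rather than the answerer's code, and permission checks and error handling are omitted:

import UIKit
import AVFoundation

// Sketch only: names are illustrative; permission checks and error handling omitted.
class CameraViewController: UIViewController {
    // One capture session and one preview layer serve both photo and video.
    let captureSession = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()
    let movieOutput = AVCaptureMovieFileOutput()
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupSession()
    }

    private func setupSession() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              captureSession.canAddInput(input) else { return }
        captureSession.addInput(input)
        if captureSession.canAddOutput(photoOutput) { captureSession.addOutput(photoOutput) }
        if captureSession.canAddOutput(movieOutput) { captureSession.addOutput(movieOutput) }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }
}

// Extension 1: photo capture callbacks.
extension CameraViewController: AVCapturePhotoCaptureDelegate {
    func takePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        _ = image // use the captured photo here
    }
}

// Extension 2: movie recording callbacks.
extension CameraViewController: AVCaptureFileOutputRecordingDelegate {
    func startRecording(to url: URL) {
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Use the recorded movie file at outputFileURL here.
    }
}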
You can present a UIImagePickerController with the camera source type to capture a photo:

if UIImagePickerController.isSourceTypeAvailable(.camera) {
    let img = UIImagePickerController()
    img.delegate = self // the presenter must conform to UIImagePickerControllerDelegate & UINavigationControllerDelegate
    img.sourceType = .camera
    img.mediaTypes = [kUTTypeImage as String] // kUTTypeImage requires importing MobileCoreServices; add other media types if needed
    img.allowsEditing = true
    rootNavigationController?.present(img, animated: true, completion: nil)
}
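For completeness, here is a minimal sketch (not part of the original answer) of the delegate side that receives the captured photo. MyViewController is a hypothetical name for whichever view controller presents the picker:

// Sketch only: MyViewController is a placeholder for the presenting view controller.
extension MyViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // allowsEditing is true above, so prefer the edited image and fall back to the original.
        let image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
        picker.dismiss(animated: true, completion: nil)
        _ = image // use the captured photo here
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
        picker.dismiss(animated: true, completion: nil)
    }
}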
please add more details – Tilo, Nov 24 '18 at 7:49
You can use UIImagePickerController for both videos and images:

lazy var picker: UIImagePickerController = {
    let pickerView = UIImagePickerController()
    pickerView.delegate = self // the owner must conform to UIImagePickerControllerDelegate & UINavigationControllerDelegate
    pickerView.mediaTypes = [kUTTypeMovie as String, kUTTypeImage as String] // requires importing MobileCoreServices
    pickerView.sourceType = .camera
    pickerView.videoQuality = .type640x480
    return pickerView
}()

I have done the following (see the delegate sketch below):
- Record a video, save it to the photo album, and also send it to the server.
- Capture a photo and send it to the server.
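Here is a minimal sketch (my addition, not from the answer) of how the didFinishPickingMediaWithInfo callback can tell the two media types apart; the upload step and photo-library permission handling are omitted:

import MobileCoreServices // kUTTypeMovie, kUTTypeImage

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    defer { picker.dismiss(animated: true, completion: nil) }
    guard let mediaType = info[.mediaType] as? String else { return }

    if mediaType == (kUTTypeMovie as String), let videoURL = info[.mediaURL] as? URL {
        // Recorded video: save it to the photo album, then upload videoURL to the server.
        UISaveVideoAtPathToSavedPhotosAlbum(videoURL.path, nil, nil, nil)
    } else if mediaType == (kUTTypeImage as String), let image = info[.originalImage] as? UIImage {
        // Captured photo: upload `image` to the server.
        _ = image
    }
}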