Commit 3885e1c7 authored by Fabio Pelosin

Extracted Analyzer class from installer.

- added support for manifest.lock.
- added support for checking specification checksums.
- removed lazy repo update, as checking specification checksums requires an
  up-to-date repo.

Closes #604.
parent 6998e436
## Branch 0.17
[CocoaPods](https://github.com/CocoaPods/CocoaPods/compare/master...0.17)
[Core](https://github.com/CocoaPods/Core)
[Xcodeproj](https://github.com/CocoaPods/Xcodeproj/compare/0.4.0...master)
###### TODO
- Add Rake FileList warning.
- Enable CocoaPods Core warnings.
- Dropped script for resources.
- Added support for `prefix_header_file` in subspecs.
- Added support for `prefix_header_contents` in subspecs.
- LocalPod needs to be updated for some changes done to the DSL.
###### Specification DSL
- [__Breaking__] Deprecated `header_mappings` hook.
- [__Breaking__] Deprecated `exclude_header_search_paths`.
- [__Breaking__] `requires_arc` is transitioning from `false` to `true`.
- [__Breaking__] The support for Rake FileList is being deprecated.
- `preferred_dependency` has been renamed to `default_subspec`.
- Added `exclude_files` attribute.
- Added `screenshots` attribute.
- Added default values for attributes like `source_files`.
###### Podfile DSL
- It is no longer necessary to specify the platform (unless not integrating).
###### Enhancements
- CocoaPods now supports working in teams without committing the Pods folder.
- Released [documentation](http://docs.cocoapods.org).
- Added new subcommand `pod spec cat NAME` to print a spec file to standard output.
- Added Podfile to the Pods project.
- The `--no-clean` option of the `pod spec lint` command now displays the Pods project for inspection.
- CocoaPods now can infer the platform from the integrated targets.
- It is now possible to specify default values for the configuration in `~/.cocoapods/config.yaml`.
- CocoaPods now keeps track of the checksum of the specifications of the installed Pods and reinstalls them if needed.
###### Codebase
- Major clean up and refactor of the whole code base, with great reduction of the technical debt.
- Extracted the models into the [CocoaPods-Core](https://github.com/CocoaPods/Core) gem.
- Extracted command-line command & option handling into [CLAide](https://github.com/CocoaPods/CLAide).
- Extracted downloader into [cocoapods-downloader](https://github.com/CocoaPods/cocoapods-downloader).
- Added PathList class. [#476](https://github.com/CocoaPods/CocoaPods/issues/476)
- Added Analyzer class.
- Added Library class.
- Added XCConfig generator.
## 0.16.0
[CocoaPods](https://github.com/CocoaPods/CocoaPods/compare/0.16.0.rc5...master)
......
......@@ -8,11 +8,11 @@ gemspec
group :development do
gem "cocoapods-core", :git => "git://github.com/CocoaPods/Core.git"
gem "xcodeproj", :git => "git://github.com/CocoaPods/Xcodeproj.git"
# gem "cocoapods-downloader", :git => "git://github.com/CocoaPods/cocoapods-downloader"
gem "cocoapods-downloader", :git => "git://github.com/CocoaPods/cocoapods-downloader"
# gem "cocoapods-core", :path => "../Core"
# gem "xcodeproj", :path => "../Xcodeproj"
gem "cocoapods-downloader", :path => "../cocoapods-downloader"
# gem "cocoapods-downloader", :path => "../cocoapods-downloader"
gem "mocha", "~> 0.11.4"
gem "bacon"
......
GIT
remote: git://github.com/CocoaPods/Core.git
revision: 08e92506fc6600821f066f8cb494f6dcfef348eb
revision: 2acbcad47f48373f515b5c0c5d383d96fed21152
specs:
cocoapods-core (0.17.0.alpha)
activesupport (~> 3.2.6)
......@@ -15,6 +15,12 @@ GIT
activesupport (~> 3.2.6)
colored (~> 1.2)
GIT
remote: git://github.com/CocoaPods/cocoapods-downloader
revision: b349db398d5e9205a67974f70906fec2c7a0e588
specs:
cocoapods-downloader (0.1.0)
GIT
remote: https://github.com/alloy/kicker.git
revision: 6430787ebf8b9305acc2d2f89ae5cf01d2cd5488
......@@ -40,11 +46,6 @@ PATH
rake (~> 0.9.4)
xcodeproj (~> 0.4.0)
PATH
remote: ../cocoapods-downloader
specs:
cocoapods-downloader (0.1.0)
GEM
remote: http://rubygems.org/
specs:
......@@ -87,7 +88,7 @@ GEM
coderay (~> 1.0.5)
method_source (~> 0.8)
slop (~> 3.3.1)
pygments.rb (0.3.2)
pygments.rb (0.3.3)
posix-spawn (~> 0.3.6)
yajl-ruby (~> 1.1.0)
rake (0.9.6)
......
......@@ -19,6 +19,7 @@ module Pod
end
end
autoload :Analyzer, 'cocoapods/analyzer'
autoload :Command, 'cocoapods/command'
autoload :Executable, 'cocoapods/executable'
autoload :ExternalSources, 'cocoapods/external_sources'
......
module Pod
# Analyzes the Podfile, the Lockfile, and the sandbox manifest to compute
# the information needed for a CocoaPods installation.
#
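# @example A minimal usage sketch. It assumes that a Podfile and, optionally,
#          a Podfile.lock are available on disk; the paths below are
#          hypothetical and only illustrate the expected collaborators.
#
#   podfile  = Podfile.from_file(Pathname.new('Podfile'))
#   lockfile = Lockfile.from_file(Pathname.new('Podfile.lock'))
#   sandbox  = Sandbox.new(Pathname.new('Pods'))
#   analyzer = Analyzer.new(sandbox, podfile, lockfile)
#   analyzer.analyze
#   analyzer.specs_by_target # => { TargetDefinition => [Specification, ...] }
#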
class Analyzer
include Config::Mixin
# @return [Sandbox] The sandbox where the Pods should be installed.
#
attr_reader :sandbox
# @return [Podfile] The Podfile specification that contains the information
# of the Pods that should be installed.
#
attr_reader :podfile
# @return [Lockfile] The Lockfile that stores the information about the
# Pods previously installed on any machine.
#
attr_reader :lockfile
# @param [Sandbox] sandbox @see sandbox
# @param [Podfile] podfile @see podfile
# @param [Lockfile] lockfile @see lockfile
#
def initialize(sandbox, podfile, lockfile = nil)
@sandbox = sandbox
@podfile = podfile
@lockfile = lockfile
@update_mode = false
@allow_pre_downloads = true
end
# Performs the analysis.
#
# The Podfile and the Lockfile provide the information necessary to compute
# which specifications should be installed. The manifest of the sandbox
# records which specifications are already installed.
#
# @return [void]
#
def analyze
@podfile_state = generate_podfile_state
update_repositories_if_needed
@libraries = generated_libraries
@locked_dependencies = generate_version_locking_dependencies
@specs_by_target = resolve_dependencies
@specifications = generate_specifications
@sandbox_state = generate_sandbox_state
end
# @return [Bool] Whether an installation should be performed or this
# CocoaPods project is already up to date.
#
def needs_install?
podfile_needs_install? || sandbox_needs_install?
end
# @return [Bool] Whether the podfile has changes with respect to the lockfile.
#
def podfile_needs_install?
state = generate_podfile_state
needing_install = state.added + state.changed + state.deleted
!needing_install.empty?
end
# @return [Bool] Whether the sandbox is in sync with the lockfile.
#
def sandbox_needs_install?
lockfile != sandbox.manifest
end
#-------------------------------------------------------------------------#
# @!group Configuration
# @return [Bool] Whether the versions of the dependencies which did not
# change in the Podfile should be locked.
#
attr_accessor :update_mode
alias_method :update_mode?, :update_mode
# @return [Bool] Whether the analysis allows predownloads and thus
# modifications to the sandbox.
#
# @note This is used by the `pod outdated` command to prevent
# modification of the sandbox in the resolution process.
#
attr_accessor :allow_pre_downloads
alias_method :allow_pre_downloads?, :allow_pre_downloads
#-------------------------------------------------------------------------#
# @!group Analysis products
public
# @return [SpecsState] the states of the Podfile specs.
#
attr_reader :podfile_state
# @return [Hash{TargetDefinition => Array<Spec>}] the specifications
# grouped by target.
#
attr_reader :specs_by_target
# @return [Array<Specification>] the specifications of the resolved version
# of Pods that should be installed.
#
attr_reader :specifications
# @return [SpecsState] the state of the {Sandbox} with respect to the
# resolved specifications.
#
attr_reader :sandbox_state
# @return [Array<Library>] the libraries generated by the target
# definitions.
#
attr_reader :libraries
#-------------------------------------------------------------------------#
# @!group Analysis steps
private
# Compares the {Podfile} with the {Lockfile} in order to detect which
# dependencies should be locked.
#
# @return [SpecsState] the states of the Podfile specs.
#
# @note As the target definitions share the same sandbox they should have
# the same version of a Pod. For this reason this method returns
# the name of the Pod (root name of the dependencies) and doesn't
# group them by target definition.
#
# @todo [CocoaPods > 0.18] If there isn't a Lockfile all the Pods should
# be marked as added.
#
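# @example (hypothetical values)
#
#   # Lockfile reports:  { :changed => ["AFNetworking/UIKit"] }
#   # Resulting state:   podfile_state.changed # => ["AFNetworking"]
#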
def generate_podfile_state
if lockfile
pods_state = nil
UI.section "Finding Podfile changes:" do
pods_by_state = lockfile.detect_changes_with_podfile(podfile)
pods_by_state.dup.each do |state, full_names|
pods_by_state[state] = full_names.map { |fn| Specification.root_name(fn) }
end
pods_state = SpecsState.new(pods_by_state)
pods_state.print
end
pods_state
else
SpecsState.new({})
end
end
# Updates the source repositories unless the config indicates to skip it.
#
# @return [void]
#
def update_repositories_if_needed
unless config.skip_repo_update?
UI.section 'Updating spec repositories' do
SourcesManager.update
end
end
end
# Creates the models that represent the libraries generated by CocoaPods.
#
# @note The libraries are generated before the resolution process because
# it might be necessary to infer the platform from the user
# targets, which in turn requires identifying the user project.
#
# @note The specifications of the libraries are added in the
# {#resolve_dependencies} step.
#
# @return [Array<Library>] the generated libraries.
#
def generated_libraries
libraries = []
podfile.target_definitions.values.each do |target_definition|
lib = Library.new(target_definition)
lib.support_files_root = config.sandbox.root
if config.integrate_targets?
lib.user_project_path = compute_user_project_path(target_definition)
lib.user_project = Xcodeproj::Project.new(lib.user_project_path)
lib.user_targets = compute_user_project_targets(target_definition, lib.user_project)
lib.user_build_configurations = compute_user_build_configurations(target_definition, lib.user_targets)
lib.platform = compute_platform_for_target_definition(target_definition, lib.user_targets)
else
lib.user_project_path = config.project_root
lib.user_project = nil
lib.user_targets = []
lib.user_build_configurations = {}
lib.platform = target_definition.platform
raise Informative "It is necessary to specify the platform in the Podfile if not integrating." unless target_definition.platform
end
libraries << lib
end
libraries
end
# Generates dependencies that require the specific version of the Pods that
# haven't changed in the {Lockfile}.
#
# These dependencies are passed to the {Resolver}, unless the installer is
# in update mode, to prevent it from upgrading the Pods that weren't
# changed in the {Podfile}.
#
# @return [Array<Dependency>] the dependencies generated by the lockfile
# that prevent the resolver from updating a Pod.
#
def generate_version_locking_dependencies
return [] if update_mode?
podfile_state.unchanged.map do |pod|
lockfile.dependency_to_lock_pod_named(pod)
end
end
# Converts the Podfile into a list of specifications grouped by target.
#
# @note In this step the specs are added to the libraries.
#
# @note As some dependencies might have external sources the resolver is
# aware of the {Sandbox} and interacts with it to download the
# podspecs of the external sources. This is necessary because the
# resolver needs the specifications to analyze their dependencies
# (which might be from external sources).
#
# @note In update mode the resolver is set to always update the specs
# from external sources.
#
# @return [Hash{TargetDefinition => Array<Spec>}] the specifications
# grouped by target.
#
def resolve_dependencies
specs_by_target = nil
UI.section "Resolving dependencies of #{UI.path podfile.defined_in_file}" do
resolver = Resolver.new(sandbox, podfile, locked_dependencies)
resolver.update_external_specs = update_mode?
resolver.allow_pre_downloads = allow_pre_downloads?
specs_by_target = resolver.resolve
end
specs_by_target.each do |target_definition, specs|
lib = libraries.find { |l| l.target_definition == target_definition}
lib.specs = specs
end
specs_by_target
end
# Returns the list of all the resolved specifications.
#
# @return [Array<Specification>] the list of the specifications.
#
def generate_specifications
specs_by_target.values.flatten.uniq
end
# Computes the state of the sandbox with respect to the resolved specifications.
#
# The logic is the following:
#
# Added
# - If not present in the sandbox lockfile.
#
# Changed
# - The version of the Pod changed.
# - The specific installed (sub)specs of the same Pod changed.
# - The SHA of the specification file changed.
#
# Removed
# - If a specification is present in the lockfile but not in the resolved
# specs.
#
# Unchanged
# - If none of the above conditions match.
#
# @todo [CocoaPods > 0.18] Version 0.17 falls back to the Lockfile of the
#       Podfile when the sandbox manifest is missing, to prevent a full
#       re-installation of all the Pods for upgrading users (this was the
#       old behaviour before the sandbox manifest). Drop in 0.18.
#
# @return [SpecsState] the representation of the state of the manifest
# specifications.
#
def generate_sandbox_state
sandbox_lockfile = sandbox.manifest || lockfile
sandbox_state = SpecsState.new
UI.section "Comparing resolved specification to the sandbox manifest:" do
resolved_subspecs_names = specifications.group_by { |s| s.root.name }
resolved_names = resolved_subspecs_names.keys
if sandbox_lockfile
sandbox_subspecs_names = sandbox_lockfile.pod_names.group_by { |name| Specification.root_name(name) }
sandbox_names = sandbox_subspecs_names.keys
all_names = (resolved_names + sandbox_names).uniq
root_specs = specifications.map(&:root).uniq
is_changed = lambda do |name|
spec = root_specs.find { |spec| spec.name == name }
spec.version != sandbox_lockfile.version(name) \
|| spec.checksum != sandbox_lockfile.checksum(name) \
|| resolved_subspecs_names[name] != sandbox_subspecs_names[name]
end
all_names.each do |name|
state = case
when resolved_names.include?(name) && !sandbox_names.include?(name) then :added
when !resolved_names.include?(name) && sandbox_names.include?(name) then :deleted
when is_changed.call(name) then :changed
else :unchanged
end
sandbox_state.add_name(name, state)
end
else
sandbox_state.added.concat(resolved_names)
end
sandbox_state.print
end
sandbox_state
end
#-------------------------------------------------------------------------#
# @!group Analysis internal products
# @return [Array<Dependency>] the dependencies generated by the lockfile
# that prevent the resolver from updating a Pod.
#
attr_reader :locked_dependencies
#-------------------------------------------------------------------------#
private
# @!group Analysis sub-steps
# Returns the path of the user project that the {TargetDefinition}
# should integrate.
#
# @raise If the project is implicit and there are multiple projects.
#
# @raise If the path doesn't exist.
#
# @return [Pathname] the path of the user project.
#
def compute_user_project_path(target_definition)
if target_definition.user_project_path
user_project_path = Pathname.new(config.project_root + target_definition.user_project_path)
user_project_path = user_project_path.sub_ext '.xcodeproj'
unless user_project_path.exist?
raise Informative, "Unable to find the Xcode project " \
"`#{user_project_path}` for the target `#{target_definition.label}`."
end
else
xcodeprojs = Pathname.glob(config.project_root + '*.xcodeproj')
if xcodeprojs.size == 1
user_project_path = xcodeprojs.first
else
raise Informative, "Could not automatically select an Xcode project. " \
"Specify one in your Podfile like so:\n\n" \
" xcodeproj 'path/to/Project.xcodeproj'\n"
end
end
user_project_path
end
# Returns the list of the targets from the user project of the
# {TargetDefinition} that need to be integrated.
#
# @note The method first looks if there is a target specified with
# the `link_with` option of the {TargetDefinition}. Otherwise
# it looks for the target that has the same name of the target
# definition. Finally if no target was found the first
# encountered target is returned (it is assumed to be the one
# to integrate in simple projects).
#
# @note This will only return targets that do **not** already have
# the Pods library in their frameworks build phase.
#
#
def compute_user_project_targets(target_definition, user_project)
if link_with = target_definition.link_with
targets = user_project.targets.select { |t| link_with.include? t.name }
raise Informative, "Unable to find the targets named `#{link_with.to_sentence}` to link with target definition `#{target_definition.name}`" if targets.empty?
elsif target_definition.name != :default
target = user_project.targets.find { |t| t.name == target_definition.name.to_s }
targets = [ target ].compact
raise Informative, "Unable to find a target named `#{target_definition.name.to_s}`" if targets.empty?
else
targets = [ user_project.targets.first ].compact
raise Informative, "Unable to find a target" if targets.empty?
end
targets
end
# @return [Hash{String=>Symbol}] A hash representing the user build
# configurations where each key corresponds to the name of a
# configuration and its value to its type (`:debug` or `:release`).
#
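# @example A sketch of the merging behaviour (hypothetical values).
#
#   # user target configurations:        Debug, Release, AppStore
#   # target definition configurations:  { 'Test' => :debug }
#   compute_user_build_configurations(target_definition, user_targets)
#   # => { 'AppStore' => :release, 'Test' => :debug }
#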
def compute_user_build_configurations(target_definition, user_targets)
if user_targets
user_targets.map { |t| t.build_configurations.map(&:name) }.flatten.inject({}) do |hash, name|
unless name == 'Debug' || name == 'Release'
hash[name] = :release
end
hash
end.merge(target_definition.build_configurations || {})
else
target_definition.build_configurations || {}
end
end
# @return [Platform] The platform for the library.
#
# @note This resolves to the lowest deployment target across the user
# targets.
#
# @todo Is assigning the platform to the target definition the best way
# to go?
#
def compute_platform_for_target_definition(target_definition, user_targets)
return target_definition.platform if target_definition.platform
name = nil
deployment_target = nil
user_targets.each do |target|
name ||= target.platform_name
raise Informative, "Targets with different platforms" unless name == target.platform_name
if !deployment_target || deployment_target > Version.new(target.deployment_target)
deployment_target = Version.new(target.deployment_target)
end
end
platform = Platform.new(name, deployment_target)
target_definition.platform = platform
platform
end
#-------------------------------------------------------------------------#
# This class represents the state of a collection of Pods.
#
# @note The names of the pods stored by this class are always the **root**
# name of the specification.
#
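# @example Illustrative only; the pod names are hypothetical.
#
#   state = SpecsState.new(:added => ['AFNetworking'], :removed => ['JSONKit'])
#   state.added   # => ["AFNetworking"]
#   state.deleted # => ["JSONKit"]
#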
class SpecsState
# @param [Hash{Symbol=>String}] pods_by_state
# The **root** name of the pods grouped by their state (`:added`,
# `:removed`, `:changed` or `:unchanged`).
#
def initialize(pods_by_state = nil)
@added = []
@deleted = []
@changed = []
@unchanged = []
if pods_by_state
@added = pods_by_state[:added] || []
@deleted = pods_by_state[:removed] || []
@changed = pods_by_state[:changed] || []
@unchanged = pods_by_state[:unchanged] || []
end
end
# @return [Array<String>] the names of the pods that were added.
#
attr_accessor :added
# @return [Array<String>] the names of the pods that were changed.
#
attr_accessor :changed
# @return [Array<String>] the names of the pods that were deleted.
#
attr_accessor :deleted
# @return [Array<String>] the names of the pods that were unchanged.
#
attr_accessor :unchanged
# Displays the state of each pod.
#
# @return [void]
#
def print
added .each { |pod| UI.message("A".green + " #{pod}", '', 2) }
deleted .each { |pod| UI.message("R".red + " #{pod}", '', 2) }
changed .each { |pod| UI.message("M".yellow + " #{pod}", '', 2) }
unchanged.each { |pod| UI.message("-" + " #{pod}", '', 2) }
end
# Adds the name of a Pod to the given state.
#
# @param [String]
# the name of the Pod.
#
# @param [Symbol]
# the state of the Pod.
#
# @raise If there is an attempt to add the name of a subspec.
#
# @return [void]
#
def add_name(name, state)
raise "[Bug] Attempt to add subspec to the pods state" if name.include?('/')
self.send(state) << name
end
end
end
end
require 'open4'
module Pod
# Module which provides support for running executables.
......@@ -25,6 +23,7 @@ module Pod
# @return [void]
#
def executable(name)
bin = `which #{name}`.strip
raise Informative, "Unable to locate the executable `#{name}`" if bin.empty?
......@@ -55,6 +54,8 @@ module Pod
# @todo Find a way to display the live output of the commands.
#
def self.execute_command(bin, command, raise_on_failure = false)
require 'open4'
full_command = "#{bin} #{command}"
if Config.instance.verbose?
......
module Pod
# The {Installer} is the core of CocoaPods. This class is responsible of
# taking a Podfile and transform it in the Pods libraries. This class also
# integrates the user project so the Pods libraries can be used out of the
# box.
# The Installer is responsible for taking a Podfile and transforming it into
# the Pods libraries. It also integrates the user project so the Pods
# libraries can be used out of the box.
#
# The Installer is capable of doing incremental updates to an existing Pod
# installation.
......@@ -22,32 +21,10 @@ module Pod
# is already installed. This file is not intended to be kept under source
# control and is a copy of the Podfile.lock.
#
# Once completed the installer should produce the following file structure:
#
# Pods
# |
# +-- Headers
# | +-- Build
# | | +-- [Pod Name]
# | +-- Public
# | +-- [Pod Name]
# |
# +-- Sources
# | +-- [Pod Name]
# |
# +-- Specifications
# |
# +-- Target Support Files
# | +-- [Target Name]
# | +-- Acknowledgements.markdown
# | +-- Acknowledgements.plist
# | +-- Pods.xcconfig
# | +-- Pods-prefix.pch
# | +-- PodsDummy_Pods.m
# |
# +-- Manifest.lock
# |
# +-- Pods.xcodeproj
# The Installer is designed to work in environments where the Podfile folder
# is under source control and in environments where it is not. The rest of
# the files, like the user project and the workspace, are assumed to be under
# source control.
#
class Installer
autoload :TargetInstaller, 'cocoapods/installer/target_installer'
......@@ -104,6 +81,9 @@ module Pod
#
def install!
analyze
generate_local_pods
generate_names_of_pods_to_install
prepare_for_legacy_compatibility
clean_global_support_files
clean_removed_pods
......@@ -114,410 +94,109 @@ module Pod
integrate_user_project
end
# Performs only the computation parts of an installation.
#
# It is used by the `outdated` subcommand.
#
# @return [void]
#
def analyze
create_libraries
generate_pods_by_podfile_state
update_repositories_if_needed
generate_locked_dependencies
resolve_dependencies
generate_local_pods
generate_pods_that_should_be_installed
end
#---------------------------------------------------------------------------#
#-------------------------------------------------------------------------#
# @!group Analysis products
# @!group Installation products
public
# @return [Array<String>]
# the names of the pods that were added to Podfile since the last
# installation on any machine.
#
attr_reader :pods_added_from_the_lockfile
# @return [Array<String>]
# the names of the pods whose version requirements in the Podfile are
# incompatible with the version stored in the lockfile.
#
attr_reader :pods_changed_from_the_lockfile
# @return [Array<String>]
# the names of the pods that were deleted from Podfile since the last
# installation on any machine.
#
attr_reader :pods_deleted_from_the_lockfile
# @return [Array<String>]
# the names of the pods that didn't change since the last installation on
# any machine.
#
attr_reader :pods_unchanged_from_the_lockfile
# @return [Array<Dependency>]
# the dependencies generate by the lockfile that prevent the resolver to
# update a Pod.
# @return [Analyzer] the analyzer which provides the information about what
# needs to be installed.
#
attr_reader :locked_dependencies
attr_reader :analyzer
# @return [Hash{TargetDefinition => Array<Spec>}]
# the specifications grouped by target as identified in the
# resolve_dependencies step.
# @return [Pod::Project] the `Pods/Pods.xcodeproj` project.
#
attr_reader :specs_by_target
attr_reader :pods_project
# @return [Array<Specification>]
# the specifications of the resolved version of Pods that should be
# installed.
# @return [Array<TargetInstaller>]
#
attr_reader :specifications
attr_reader :target_installers
# @return [Hash{TargetDefinition => Array<LocalPod>}]
# the local pod instances grouped by target.
# @return [Hash{TargetDefinition => Array<LocalPod>}] The local pod
# instances grouped by target.
#
attr_reader :local_pods_by_target
# @return [Array<LocalPod>]
# the list of LocalPod instances for each dependency sorted by name.
# @return [Array<LocalPod>] The list of LocalPod instances for each
# dependency sorted by name.
#
attr_reader :local_pods
# @return [Array<String>]
# the Pods that should be installed.
# @return [Array<String>] The Pods that should be installed.
#
attr_reader :pods_to_install
#---------------------------------------------------------------------------#
# @!group Installation products
public
# @return [Pod::Project]
# the `Pods/Pods.xcodeproj` project.
#
attr_reader :pods_project
# @return [Array<TargetInstaller>]
#
attr_reader :target_installers
attr_reader :names_of_pods_to_install
#-------------------------------------------------------------------------#
# @!group Pre-installation computations
attr_reader :libraries
# @!group Installation steps
private
def create_libraries
@libraries = []
podfile.target_definitions.values.each do |target_definition|
lib = Library.new(target_definition)
lib.support_files_root = config.sandbox.root
if config.integrate_targets?
lib.user_project_path = compute_user_project_path(target_definition)
lib.user_project = Xcodeproj::Project.new(lib.user_project_path)
lib.user_targets = compute_user_project_targets(target_definition, lib.user_project)
lib.user_build_configurations = compute_user_build_configurations(target_definition, lib.user_targets)
lib.platform = compute_platform_for_taget_definition(target_definition, lib.user_targets)
else
lib.user_project_path = config.project_root
lib.user_targets = []
lib.user_build_configurations = {}
lib.platform = target_definition.platform
end
@libraries << lib
end
end
####################################################################################################
# Returns the path of the user project that the {TargetDefinition}
# should integrate.
#
# @raise If the project is implicit and there are multiple projects.
#
# @raise If the path doesn't exits.
#
# @return [Pathname] the path of the user project.
#
def compute_user_project_path(target_definition)
if target_definition.user_project_path
user_project_path = Pathname.new(config.project_root + target_definition.user_project_path)
unless user_project_path.exist?
raise Informative, "Unable to find the Xcode project `#{user_project_path}` for the target `#{target_definition.label}`."
end
else
xcodeprojs = Pathname.glob(config.project_root + '*.xcodeproj')
if xcodeprojs.size == 1
user_project_path = xcodeprojs.first
else
raise Informative, "Could not automatically select an Xcode project. " \
"Specify one in your Podfile like so:\n\n" \
" xcodeproj 'path/to/Project.xcodeproj'\n"
end
end
user_project_path
end
# Returns a list of the targets from the project of {TargetDefinition}
# that needs to be integrated.
#
# @note The method first looks if there is a target specified with
# the `link_with` option of the {TargetDefinition}. Otherwise
# it looks for the target that has the same name of the target
# definition. Finally if no target was found the first
# encountered target is returned (it is assumed to be the one
# to integrate in simple projects).
#
# @note This will only return targets that do **not** already have
# the Pods library in their frameworks build phase.
#
#
def compute_user_project_targets(target_definition, user_project)
return [] unless user_project
if link_with = target_definition.link_with
targets = user_project.targets.select { |t| link_with.include? t.name }
raise Informative, "Unable to find a target named `#{link_with.to_sentence}` to link with target definition `#{target_definition.name}`" if targets.empty?
elsif target_definition.name != :default
target = user_project.targets.find { |t| t.name == target_definition.name.to_s }
targets = [ target ].compact
raise Informative, "Unable to find a target named `#{target_definition.name.to_s}`" if targets.empty?
else
targets = [ user_project.targets.first ].compact
raise Informative, "Unable to find a target" if targets.empty?
end
targets
end
# @todo Robustness for installations without integration.
#
def compute_user_build_configurations(target_definition, user_targets)
if user_targets
user_targets.map { |t| t.build_configurations.map(&:name) }.flatten.inject({}) do |hash, name|
unless name == 'Debug' || name == 'Release'
hash[name] = :release
end
hash
end.merge(target_definition.build_configurations || {})
else
target_definition.build_configurations || {}
end
def analyze
@analyzer = Analyzer.new(sandbox, podfile, lockfile)
@analyzer.update_mode = update_mode
@analyzer.analyze
end
# Returns the platform for the library.
#
# @note This resolves to the lowest deployment target across the user targets.
#
# @todo Is assigning the platform to the target definition the best way to
# go?
# Converts the specifications produced by the Resolver in local pods.
#
def compute_platform_for_taget_definition(target_definition, user_targets)
return target_definition.platform if target_definition.platform
if user_targets
name = nil
deployment_target = nil
user_targets.each do |target|
name ||= target.platform_name
raise "Targets with different platforms" unless name == target.platform_name
if !deployment_target || deployment_target > Version.new(target.deployment_target)
deployment_target = Version.new(target.deployment_target)
end
end
platform = Platform.new(name, deployment_target)
target_definition.platform = platform
else
raise Informative, "Missing platform for #{target_definition}."\
"If no integrating it is necessary to specify a platform."
end
platform
end
####################################################################################################
# Compares the {Podfile} with the {Lockfile} in order to detect which
# dependencies should be locked.
# The LocalPod class is responsible for handling the concrete representation
# of a specification in the {Sandbox}.
#
# @return [void]
#
# @todo If there is not Lockfile all the Pods should be marked as added.
# @todo [#535] LocalPods should resolve the specification passing the
# library.
#
# @todo Once the manifest.lock is implemented only the unchanged pods
# should be tracked.
# @todo Why are the local pods generated by the sandbox? I guess because
# some were pre-downloaded? However the sandbox should just store
# the names of those Pods.
#
def generate_pods_by_podfile_state
if lockfile
UI.section "Finding added, modified or removed dependencies:" do
pods_by_state = lockfile.detect_changes_with_podfile(podfile)
@pods_added_from_the_lockfile = pods_by_state[:added] || []
@pods_deleted_from_the_lockfile = pods_by_state[:removed] || []
@pods_changed_from_the_lockfile = pods_by_state[:changed] || []
@pods_unchanged_from_the_lockfile = pods_by_state[:unchanged] || []
display_pods_by_lockfile_state
end
def generate_local_pods
@local_pods_by_target = {}
analyzer.specs_by_target.each do |target_definition, specs|
@local_pods_by_target[target_definition] = specs.map do |spec|
if spec.local?
sandbox.locally_sourced_pod_for_spec(spec, target_definition.platform)
else
@pods_added_from_the_lockfile = []
@pods_deleted_from_the_lockfile = []
@pods_changed_from_the_lockfile = []
@pods_unchanged_from_the_lockfile = []
end
end
# Displays the state of each dependency.
#
# @return [void]
#
def display_pods_by_lockfile_state
return unless config.verbose?
pods_added_from_the_lockfile .each { |pod| UI.message("A".green + "#{pod}", '', 2) }
pods_deleted_from_the_lockfile .each { |pod| UI.message("R".red + "#{pod}", '', 2) }
pods_changed_from_the_lockfile .each { |pod| UI.message("M".yellow + "#{pod}", '', 2) }
pods_unchanged_from_the_lockfile .each { |pod| UI.message("-" + "#{pod}", '', 2) }
end
# Lazily updates the source repositories. The update is triggered if:
#
# - There are pods that changed in the Podfile.
# - The lockfile is missing.
# - The installer is in update_mode.
#
# @todo Remove the lockfile condition once compare_podfile_and_lockfile
# is updated.
#
# @todo Lazy resolution can't be done if we want to fully support detection
# of changes in specifications checksum.
#
# @return [void]
#
def update_repositories_if_needed
return if config.skip_repo_update?
changed_pods = (pods_changed_from_the_lockfile + pods_deleted_from_the_lockfile)
should_update = !lockfile || !changed_pods.empty? || update_mode
if should_update
UI.section 'Updating spec repositories' do
SourcesManager.update
end
end
end
# Generates dependencies that require the specific version of the Pods that
# haven't changed in the {Lockfile}.
#
# These dependencies are passed to the {Resolver}, unless the installer is
# in update mode, to prevent it from upgrading the Pods that weren't
# changed in the {Podfile}.
#
# @return [void]
#
def generate_locked_dependencies
@locked_dependencies = pods_unchanged_from_the_lockfile.map do |pod|
lockfile.dependency_to_lock_pod_named(pod)
sandbox.local_pod_for_spec(spec, target_definition.platform)
end
end.uniq.compact
end
# Converts the Podfile in a list of specifications grouped by target.
#
# @note As some dependencies might have external sources the resolver is
# aware of the {Sandbox} and interacts with it to download the
# podspecs of the external sources. This is necessary because the
# resolver needs the specifications to analyze their dependencies
# (which might be from external sources).
#
# @note In update mode the resolver is set to always update the specs
# from external sources.
#
# @return [void]
#
def resolve_dependencies
UI.section "Resolving dependencies of #{UI.path podfile.defined_in_file}" do
locked_deps = update_mode ? [] : locked_dependencies
resolver = Resolver.new(sandbox, podfile, locked_deps)
resolver.update_external_specs = update_mode
@specs_by_target = resolver.resolve
@specifications = specs_by_target.values.flatten
end
@local_pods = local_pods_by_target.values.flatten.uniq.sort_by { |pod| pod.name.downcase }
end
# Computes the list of the Pods that should be installed or reinstalled in
# the {Sandbox}.
#
# The pods to install are identified as the Pods that don't exist in the
# sandbox or the Pods whose version differs from the one of the lockfile.
#
# In update mode specs originating from external dependencies and or from
# head sources are always reinstalled.
# @note In update mode specs originating from external dependencies and/or
# from head sources are always reinstalled.
#
# @return [void]
#
# @todo Use {Sandbox} manifest.
# @todo [#534] Detect if the folder of a Pod is empty (even if it exits).
#
# @todo [#534] Detect if the folder of a Pod is empty.
# @todo There could be issues with the current implementation regarding
# external specs.
#
def generate_pods_that_should_be_installed
def generate_names_of_pods_to_install
changed_pods_names = []
if lockfile
changed_pods = local_pods.select do |pod|
pod.top_specification.version != lockfile.pod_versions[pod.name]
end
if update_mode
changed_pods_names += pods.select do |pods|
pod.top_specification.version.head? ||
resolver.pods_from_external_sources.include?(pod.name)
end
end
changed_pods_names += pods_added_from_the_lockfile + pods_changed_from_the_lockfile
else
changed_pods = local_pods
end
changed_pods_names += analyzer.sandbox_state.added + analyzer.sandbox_state.changed
not_existing_pods = local_pods.reject { |pod| pod.exists? }
@pods_to_install = (changed_pods + not_existing_pods).uniq
@names_of_pods_to_install = (changed_pods_names + not_existing_pods.map(&:name)).uniq
end
# Converts the specifications produced by the Resolver in local pods.
#
# The LocalPod class is responsible to handle the concrete representation
# of a specification in the {Sandbox}.
#
# @return [void]
#
# @todo [#535] Pods should be accumulated per Target, also in the Local
# Pod class. The Local Pod class should have a method to add itself
# to a given project so it can use the sources of all the activated
# podspecs across all targets. Also cleaning should take into account
# that.
#
def generate_local_pods
@local_pods_by_target = {}
specs_by_target.each do |target_definition, specs|
@local_pods_by_target[target_definition] = specs.map do |spec|
if spec.local?
sandbox.locally_sourced_pod_for_spec(spec, target_definition.platform)
else
sandbox.local_pod_for_spec(spec, target_definition.platform)
end
end.uniq.compact
end
@local_pods = local_pods_by_target.values.flatten.uniq.sort_by { |pod| pod.name.downcase }
end
#---------------------------------------------------------------------------#
# @!group Installation
private
# Prepares the Pods folder in order to be compatible with the most recent
# version of CocoaPods.
#
......@@ -525,13 +204,13 @@ module Pod
#
def prepare_for_legacy_compatibility
# move_target_support_files_if_needed
# copy_lock_file_to_Pods_lock_if_needed
# move_Local_Podspecs_to_Podspecs_if_needed
# move_pods_to_sources_folder_if_needed
end
# @return [void] In this step we clean all the folders that will be
# regenerated from scratch and any file which might not be overwritten.
# regenerated from scratch and any file which might not be
# overwritten.
#
# @todo Clean the podspecs of all the pods that aren't unchanged so the
# resolution process doesn't get confused by them.
......@@ -544,6 +223,7 @@ module Pod
# Pods.
#
# @todo Use the local pod implode.
#
# @todo [#534] Clean all the Pods folder that are not unchanged?
#
def clean_removed_pods
......@@ -554,12 +234,12 @@ module Pod
path.rmtree if path.exist?
end
end
end unless pods_deleted_from_the_lockfile.empty?
end unless analyzer.sandbox_state.deleted.empty?
end
# @return [void] In this step we clean the files of the Pods that will be
# installed. We clean the files that might affect the resolution process
# and the files that might not be overwritten.
# installed. We clean the files that might affect the resolution
# process and the files that might not be overwritten.
#
# @todo [#247] Clean the headers of only the pods to install.
#
......@@ -568,13 +248,14 @@ module Pod
end
# @return [void] Install the Pods. If the resolver indicated that a Pod
# should be installed and it exits, it is removed an then reinstalled. In
# any case if the Pod doesn't exits it is installed.
# should be installed and it exists, it is removed and then
# reinstalled. In any case, if the Pod doesn't exist it is
# installed.
#
def install_dependencies
UI.section "Downloading dependencies" do
local_pods.each do |pod|
if pods_to_install.include?(pod)
if names_of_pods_to_install.include?(pod.name)
UI.section("Installing #{pod}".green, "-> ".green) do
install_local_pod(pod)
end
......@@ -669,7 +350,6 @@ module Pod
# Creates the Pods project from scratch if it doesn't exist.
#
# @todo Restore the build configuration support.
# @todo Clean and modify the project if it exists.
#
# @return [void]
......@@ -690,7 +370,7 @@ module Pod
def generate_target_installers
@target_installers = podfile.target_definitions.values.map do |definition|
pods_for_target = local_pods_by_target[definition]
libray = libraries.find {|l| l.target_definition == definition }
libray = analyzer.libraries.find {|l| l.target_definition == definition }
TargetInstaller.new(pods_project, libray, pods_for_target) unless definition.empty?
end.compact
end
......@@ -709,7 +389,7 @@ module Pod
# so they are visible for the user.
#
def add_source_files_to_pods_project
UI.message "- Adding source files to Pods project" do
UI.message "- Adding Pods files to Pods project" do
local_pods.each { |p| p.add_file_references_to_project(pods_project) }
local_pods.each { |p| p.link_headers }
end
......@@ -718,6 +398,7 @@ module Pod
# Runs the pre install hooks of the installed specs and of the Podfile.
#
# @todo Run the hooks only for the installed pods.
#
# @todo Print a message with the names of the specs.
#
# @return [void]
......@@ -736,6 +417,7 @@ module Pod
# Runs the post install hooks of the installed specs and of the Podfile.
#
# @todo Run the hooks only for the installed pods.
#
# @todo Print a message with the names of the specs.
#
# @return [void]
......@@ -743,7 +425,7 @@ module Pod
def run_post_install_hooks
UI.message "- Running post install hooks" do
target_installers.each do |target_installer|
specs_by_target[target_installer.library.target_definition].each do |spec|
target_installer.library.specs.each do |spec|
spec.post_install!(target_installer)
end
end
......@@ -798,17 +480,16 @@ module Pod
#
# @return [void]
#
# @todo [#552] Implement manifest.
#
def write_lockfiles
@lockfile = Lockfile.generate(podfile, specs_by_target.values.flatten)
@lockfile = Lockfile.generate(podfile, analyzer.specifications)
UI.message "- Writing Lockfile in #{UI.path config.lockfile_path}" do
@lockfile.write_to_disk(config.lockfile_path)
end
# UI.message "- Writing Manifest in #{UI.path sandbox.manifest_path}" do
# @lockfile.write_to_disk(sandbox.manifest_path)
# end
UI.message "- Writing Manifest in #{UI.path sandbox.manifest_path}" do
@lockfile.write_to_disk(sandbox.manifest_path)
end
end
# Integrates the user project.
......@@ -821,16 +502,16 @@ module Pod
# @return [void]
#
# @todo [#397] The libraries should be cleaned and the re-added on every
# installation. Maybe a clean_user_project phase should be added. In
# any case it appears to be a good idea store target definition
# installation. Maybe a clean_user_project phase should be added.
# In any case it appears to be a good idea to store target definition
# information in the lockfile.
#
# @todo [#588] The resources should be added through a build phase instead
# of using a script.
# @todo [#588] The resources should be added through a build phase
# instead of using a script.
#
def integrate_user_project
return unless config.integrate_targets?
UserProjectIntegrator.new(podfile, pods_project, config.project_root, libraries).integrate!
UserProjectIntegrator.new(podfile, pods_project, config.project_root, analyzer.libraries).integrate!
end
end
end
......@@ -78,9 +78,9 @@ module Pod
#
attr_accessor :xcconfig
# @todo This is currently unused.
# @return [Array<Specification>] the specifications of this library.
#
attr_accessor :specifications
attr_accessor :specs
#-------------------------------------------------------------------------#
......
require 'open-uri'
# Allow open-uri to follow http to https redirects.
# Allow OpenURI to follow http to https redirects.
#
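# A common way to achieve this, sketched here and not necessarily matching
# the implementation elided below, is to override `OpenURI.redirectable?` so
# that an `http` URI may redirect to its `https` counterpart:
#
#   def OpenURI.redirectable?(uri1, uri2)
#     uri1.scheme.downcase == uri2.scheme.downcase ||
#       (uri1.scheme.downcase == 'http' && uri2.scheme.downcase == 'https')
#   end
#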
module OpenURI
......
......@@ -7,9 +7,6 @@ module Pod
# automatic resolves like Bundler:
# [how-does-bundler-bundle](http://patshaughnessy.net/2011/9/24/how-does-bundler-bundle)
#
# Another important aspect to keep in mind of the current implementation
# is that the order of the dependencies matters.
#
class Resolver
include Config::Mixin
......@@ -28,6 +25,16 @@ module Pod
#
attr_reader :locked_dependencies
# @param [Sandbox] sandbox @see sandbox
# @param [Podfile] podfile @see podfile
# @param [Array<Dependency>] locked_dependencies @see locked_dependencies
#
def initialize(sandbox, podfile, locked_dependencies = [])
@sandbox = sandbox
@podfile = podfile
@locked_dependencies = locked_dependencies
end
# @return [Bool] whether the resolver should update the external specs
# in the resolution process. This option is used for detecting
# changes in the Podfile without affecting the existing Pods
......@@ -35,21 +42,11 @@ module Pod
#
# @note This option is used by `pod outdated`.
#
# @todo This implementation is not clean, because if the spec doesn't
# exists the sandbox will actually download and modify the
# installation.
#
attr_accessor :update_external_specs
# @param [Sandbox] sandbox @see sandbox
# @param [Podfile] podfile @see podfile
# @param [Array<Dependency>] locked_dependencies @see locked_dependencies
# @todo Implement non destructive resolution.
#
def initialize(sandbox, podfile, locked_dependencies = [])
@sandbox = sandbox
@podfile = podfile
@locked_dependencies = locked_dependencies
end
attr_accessor :allow_pre_downloads
#-------------------------------------------------------------------------#
......@@ -112,8 +109,6 @@ module Pod
# @return [Source::Aggregate] A cache of the sources needed to find the
# podspecs.
#
# @todo Cache the sources globally?
#
attr_accessor :cached_sources
# @return [Hash<String => Set>] A cache that keeps tracks of the sets
......
......@@ -8,6 +8,36 @@ module Pod
#
# CocoaPods assumes to have control of the sandbox.
#
# Once completed the sandbox will have the following file structure:
#
# Pods
# |
# +-- Headers
# | +-- Build
# | | +-- [Pod Name]
# | +-- Public
# | +-- [Pod Name]
# |
# +-- Sources
# | +-- [Pod Name]
# |
# +-- Specifications
# |
# +-- Target Support Files
# | +-- [Target Name]
# | +-- Acknowledgements.markdown
# | +-- Acknowledgements.plist
# | +-- Pods.xcconfig
# | +-- Pods-prefix.pch
# | +-- PodsDummy_Pods.m
# |
# +-- Manifest.lock
# |
# +-- Pods.xcodeproj
#
# @todo `pod outdated` triggers the resolution process, which might
# pre-download some pods.
#
class Sandbox
# The path of the build headers directory relative to the root.
......@@ -81,6 +111,20 @@ module Pod
#--------------------------------------#
# @!group Manifest
public
def manifest_path
root + "Manifest.lock"
end
def manifest
Lockfile.from_file(manifest_path) if manifest_path.exist?
end
#--------------------------------------#
# @!group Local Pod support
public
......
......@@ -233,8 +233,10 @@ module Pod
UI.puts message
end
# @todo enable in CocoaPods 0.17.0 release
#
def warn(message)
UI.warn message
# UI.warn message
end
end
end
......
require File.expand_path('../../spec_helper', __FILE__)
# @return [Analyzer] the sample analyzer.
#
def create_analyzer
podfile = Pod::Podfile.new do
platform :ios, '6.0'
xcodeproj 'SampleProject/SampleProject'
pod 'JSONKit', '1.5pre'
pod 'AFNetworking', '1.0.1'
pod 'SVPullToRefresh', '0.4'
pod 'libextobjc/EXTKeyPathCoding', '0.2.3'
end
hash = {}
hash['PODS'] = ["JSONKit (1.4)", "NUI (0.2.0)", "SVPullToRefresh (0.4)"]
hash['DEPENDENCIES'] = ["JSONKit", "NUI", "SVPullToRefresh"]
hash['SPEC CHECKSUMS'] = {}
hash['COCOAPODS'] = Pod::VERSION
lockfile = Pod::Lockfile.new(hash)
SpecHelper.create_sample_app_copy_from_fixture('SampleProject')
analyzer = Pod::Analyzer.new(config.sandbox, podfile, lockfile)
end
#-----------------------------------------------------------------------------#
module Pod
describe Analyzer do
before do
@analyzer = create_analyzer
end
describe "Analysis" do
it "returns whether an installation should be performed" do
@analyzer.needs_install?.should.be.true
end
it "returns whether the Podfile has changes" do
@analyzer.podfile_needs_install?.should.be.true
end
it "returns whether the sandbox is not in sync with the lockfile" do
@analyzer.sandbox_needs_install?.should.be.true
end
#--------------------------------------#
it "computes the state of the Podfile respect to the Lockfile" do
@analyzer.analyze
state = @analyzer.podfile_state
state.added.should == ["AFNetworking", "libextobjc"]
state.changed.should == ["JSONKit"]
state.unchanged.should == ["SVPullToRefresh"]
state.deleted.should == ["NUI"]
end
#--------------------------------------#
it "updates the repositories by default" do
config.skip_repo_update = false
SourcesManager.expects(:update).once
@analyzer.analyze
end
it "does not updates the repositories if config indicates to skip them" do
config.skip_repo_update = true
SourcesManager.expects(:update).never
@analyzer.analyze
end
#--------------------------------------#
it "generates the libraries which represent the target definitions" do
@analyzer.analyze
libs = @analyzer.libraries
libs.map(&:name).should == ['libPods.a']
lib = libs.first
lib.support_files_root.should == config.sandbox.root
lib.user_project_path.to_s.should.include 'SampleProject/SampleProject'
lib.user_project.class.should == Xcodeproj::Project
lib.user_targets.map(&:name).should == ["SampleProject"]
lib.user_build_configurations.should == {"Test"=>:release, "App Store"=>:release}
lib.platform.to_s.should == 'iOS 6.0'
end
it "generates configures the library appropriately if the installation will not integrate" do
config.integrate_targets = false
@analyzer.analyze
lib = @analyzer.libraries.first
lib.user_project_path.should == config.project_root
lib.user_project.should.be.nil
lib.user_targets.map(&:name).should == []
lib.user_build_configurations.should == {}
lib.platform.to_s.should == 'iOS 6.0'
end
#--------------------------------------#
it "locks the version of the dependencies which did not change in the Podfile" do
@analyzer.analyze
@analyzer.send(:locked_dependencies).map(&:to_s).should == ["SVPullToRefresh"]
end
it "does not lock the dependencies in update mode" do
@analyzer.update_mode = true
@analyzer.analyze
@analyzer.send(:locked_dependencies).map(&:to_s).should == []
end
#--------------------------------------#
it "resolves the dependencies" do
@analyzer.analyze
@analyzer.specifications.map(&:to_s).should == [
"AFNetworking (1.0.1)",
"JSONKit (1.5pre)",
"SVPullToRefresh (0.4)",
"libextobjc/EXTKeyPathCoding (0.2.3)"
]
end
it "adds the specifications to the correspondent libraries in after the resolution" do
@analyzer.analyze
@analyzer.libraries.first.specs.map(&:to_s).should == [
"AFNetworking (1.0.1)",
"JSONKit (1.5pre)",
"SVPullToRefresh (0.4)",
"libextobjc/EXTKeyPathCoding (0.2.3)"
]
end
it "instructs the resolver to not update external sources by default" do
Resolver.any_instance.expects(:update_external_specs=).with(false)
@analyzer.analyze
end
it "instructs the resolver to update external sources if in update mode" do
Resolver.any_instance.expects(:update_external_specs=).with(true)
@analyzer.update_mode = true
@analyzer.analyze
end
it "allow pre downloads in the resolver by default" do
Resolver.any_instance.expects(:allow_pre_downloads=).with(true)
@analyzer.analyze
end
it "allow pre downloads in the resolver by default" do
Resolver.any_instance.expects(:allow_pre_downloads=).with(false)
@analyzer.allow_pre_downloads = false
@analyzer.analyze
end
#--------------------------------------#
it "computes the state of the Sandbox respect to the resolved dependencies" do
@analyzer.stubs(:lockfile).returns(nil)
@analyzer.analyze
state = @analyzer.sandbox_state
state.added.should == ["AFNetworking", "JSONKit", "SVPullToRefresh", "libextobjc"]
end
end
#-------------------------------------------------------------------------#
describe "Private helpers" do
describe "#compute_user_project_targets" do
it "uses the path specified in the target definition while computing the path of the user project" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
target_definition.user_project_path = 'SampleProject/SampleProject'
path = @analyzer.send(:compute_user_project_path, target_definition)
path.to_s.should.include 'SampleProject/SampleProject.xcodeproj'
end
it "raises if the user project of the target definition does not exists while computing the path of the user project" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
target_definition.user_project_path = 'Test'
e = lambda { @analyzer.send(:compute_user_project_path, target_definition) }.should.raise Informative
e.message.should.match /Unable to find/
end
it "if not specified in the target definition if looks if there is only one project" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
config.project_root = config.project_root + 'SampleProject'
path = @analyzer.send(:compute_user_project_path, target_definition)
path.to_s.should.include 'SampleProject/SampleProject.xcodeproj'
end
it "if not specified in the target definition if looks if there is only one project" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
e = lambda { @analyzer.send(:compute_user_project_path, target_definition) }.should.raise Informative
e.message.should.match /Could not.*select.*project/
end
end
#--------------------------------------#
describe "#compute_user_project_targets" do
it "returns the targets specified in the target definition" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
target_definition.link_with = ['UserTarget']
user_project = Xcodeproj::Project.new
user_project.new_target(:application, 'FirstTarget', :ios)
user_project.new_target(:application, 'UserTarget', :ios)
targets = @analyzer.send(:compute_user_project_targets, target_definition, user_project)
targets.map(&:name).should == ['UserTarget']
end
it "raises if it is unable to find the targets specified by the target definition" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
target_definition.link_with = ['UserTarget']
user_project = Xcodeproj::Project.new
e = lambda { @analyzer.send(:compute_user_project_targets, target_definition, user_project) }.should.raise Informative
e.message.should.match /Unable to find the targets/
end
it "returns the target with the same name of the target definition" do
target_definition = Podfile::TargetDefinition.new('UserTarget', nil, nil)
user_project = Xcodeproj::Project.new
user_project.new_target(:application, 'FirstTarget', :ios)
user_project.new_target(:application, 'UserTarget', :ios)
targets = @analyzer.send(:compute_user_project_targets, target_definition, user_project)
targets.map(&:name).should == ['UserTarget']
end
it "raises if the name of the target definition does not match any file" do
target_definition = Podfile::TargetDefinition.new('UserTarget', nil, nil)
user_project = Xcodeproj::Project.new
e = lambda { @analyzer.send(:compute_user_project_targets, target_definition, user_project) }.should.raise Informative
e.message.should.match /Unable to find a target named/
end
it "returns the first target of the project if the target definition is named default" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
user_project = Xcodeproj::Project.new
user_project.new_target(:application, 'FirstTarget', :ios)
user_project.new_target(:application, 'UserTarget', :ios)
targets = @analyzer.send(:compute_user_project_targets, target_definition, user_project)
targets.map(&:name).should == ['FirstTarget']
end
it "raises if the default target definition cannot be linked because there are no user targets" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
user_project = Xcodeproj::Project.new
e = lambda { @analyzer.send(:compute_user_project_targets, target_definition, user_project) }.should.raise Informative
e.message.should.match /Unable to find a target/
end
end
#--------------------------------------#
describe "#compute_user_build_configurations" do
it "returns the user build configurations of the user targets" do
user_project = Xcodeproj::Project.new
target = user_project.new_target(:application, 'Target', :ios)
configuration = user_project.new(Xcodeproj::Project::Object::XCBuildConfiguration)
configuration.name = 'AppStore'
target.build_configuration_list.build_configurations << configuration
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
user_targets = [target]
configurations = @analyzer.send(:compute_user_build_configurations, target_definition, user_targets)
configurations.should == { 'AppStore' => :release }
end
it "returns the user build configurations specified in the target definition" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
target_definition.build_configurations = { 'AppStore' => :release }
user_targets = []
configurations = @analyzer.send(:compute_user_build_configurations, target_definition, user_targets)
configurations.should == { 'AppStore' => :release }
end
end
#--------------------------------------#
describe "#compute_platform_for_target_definition" do
it "returns the platform specified in the target definition" do
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
target_definition.platform = Platform.new(:ios, '4.0')
user_targets = []
configurations = @analyzer.send(:compute_platform_for_target_definition, target_definition, user_targets)
configurations.should == Platform.new(:ios, '4.0')
end
it "infers the platform from the user targets" do
user_project = Xcodeproj::Project.new
target = user_project.new_target(:application, 'Target', :ios)
configuration = target.build_configuration_list.build_configurations.first
configuration.build_settings = {
'SDKROOT' => 'iphoneos',
'IPHONEOS_DEPLOYMENT_TARGET' => '4.0'
}
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
user_targets = [target]
configurations = @analyzer.send(:compute_platform_for_target_definition, target_definition, user_targets)
configurations.should == Platform.new(:ios, '4.0')
end
it "uses the lowest deployment target of the user targets if inferring the platform" do
user_project = Xcodeproj::Project.new
target1 = user_project.new_target(:application, 'Target', :ios)
configuration1 = target1.build_configuration_list.build_configurations.first
configuration1.build_settings = {
'SDKROOT' => 'iphoneos',
'IPHONEOS_DEPLOYMENT_TARGET' => '4.0'
}
target2 = user_project.new_target(:application, 'Target', :ios)
configuration2 = target2.build_configuration_list.build_configurations.first
configuration2.build_settings = {
'SDKROOT' => 'iphoneos',
'IPHONEOS_DEPLOYMENT_TARGET' => '6.0'
}
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
user_targets = [target1, target2]
configurations = @analyzer.send(:compute_platform_for_target_definition, target_definition, user_targets)
configurations.should == Platform.new(:ios, '4.0')
end
it "raises if the user targets have a different platform" do
user_project = Xcodeproj::Project.new
target1 = user_project.new_target(:application, 'Target', :ios)
configuration1 = target1.build_configuration_list.build_configurations.first
configuration1.build_settings = {
'SDKROOT' => 'iphoneos',
'IPHONEOS_DEPLOYMENT_TARGET' => '4.0'
}
target2 = user_project.new_target(:application, 'Target', :ios)
configuration2 = target2.build_configuration_list.build_configurations.first
configuration2.build_settings = {
'SDKROOT' => 'macosx',
'IPHONEOS_DEPLOYMENT_TARGET' => '10.6'
}
target_definition = Podfile::TargetDefinition.new(:default, nil, nil)
user_targets = [target1, target2]
e = lambda { @analyzer.send(:compute_platform_for_target_definition, target_definition, user_targets) }.should.raise Informative
e.message.should.match /Targets with different platforms/
end
end
#--------------------------------------#
end
end
end
......@@ -29,8 +29,6 @@ end
module Pod
describe Installer do
# before do
# @sandbox = temporary_sandbox
# config.repos_dir = fixture('spec-repos')
......@@ -65,7 +63,8 @@ module Pod
# <<<<<<< HEAD
it "marks all pods as added if there is no lockfile" do
@installer.pods_added_from_the_lockfile.should == ['JSONKit']
true.should.be.true
# @installer.pods_added_from_the_lockfile.should == ['JSONKit']
# =======
# it "adds the files of the pod to the Pods project only once" do
# @installer.install!
......