FAQ
Hello!

I have a Catalyst application that I would like to upload from the
development box to the production server. Is there some kind of best
practice to do that? My requirements:

1) The process should take care of the dependencies and run the tests
before installing. (Let's say the deps are declared in Makefile.PL
or Build.PL.)
2) It would be nice to keep the application isolated in one directory
so that I can keep several instances under the same account to do
primitive staging.

Right now I am updating the application using Git. I push from the
development box to a headless repository on the production server and
there is a hook that updates the working copy. This fails requirement (1).
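
For illustration, a hook could in principle cover (1) itself by testing in a
scratch checkout before touching the live one. A rough sketch in Perl (the
paths are made up, and the dependency step is only hinted at):

    #!/usr/bin/perl
    # post-receive hook (sketch): check the pushed branch out into a
    # scratch directory, run the test suite there, and only update the
    # live working copy when everything passes.
    use strict;
    use warnings;
    use Cwd qw(abs_path);

    # The hook runs inside the bare repository; pin GIT_DIR before chdir.
    $ENV{GIT_DIR} = abs_path($ENV{GIT_DIR} || '.');

    my $build = '/srv/myapp-build';  # throwaway checkout for testing
    my $live  = '/srv/myapp';        # the real working copy

    sub run { system(@_) == 0 or die "@_ failed\n" }

    run('git', "--work-tree=$build", 'checkout', '-f', 'master');
    chdir $build or die "chdir $build: $!\n";

    # Installing missing deps (e.g. with local::lib) would slot in here.
    run($^X, 'Makefile.PL');  # complains about missing dependencies
    run('make');
    run('make', 'test');      # dies here if the tests fail

    run('git', "--work-tree=$live", 'checkout', '-f', 'master');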

I've read something about local::lib, but I'm still not sure about how
to put things together. This has to be a common scenario, isn't it? When
you are finished updating the development version, what do you call to
upload the update to the production server and what exactly happens
along the way?

Thank you,

Tomáš Znamenáček

--
Use what talents you possess: the woods would be very silent
if no birds sang there except those that sang best. –Henry Van Dyke


  • Bill Moseley at Mar 30, 2010 at 2:54 pm
    2010/3/30 Tomáš Znamenáček <tomas.znamenacek@gmail.com>
    I have a Catalyst application that I would like to upload from the
    development box to the production server. Is there some kind of best
    practice to do that? My requirements:
    I don't think there's any standard approach. I know many people seem to
    just do a checkout from the repository.

    1) The process should take care of the dependencies and run the tests
    before installing. (Let’s say the deps are declared in Makefile.PL
    or Build.PL.)
    I have a separate cron job that polls the repository looking for changes.
    When it notices that a new version has been checked in, it checks it out
    and runs the full test suite. A big fat email goes out if the tests do not
    pass. If they pass but a previous run failed, an email also goes out
    congratulating everyone on fixing the problem.

    This works well because the same process can be used on multiple
    applications and is constantly running -- not just when it's crunch time to
    push a release.
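
    The shape of that cron job is roughly this (a sketch, not the actual
    script; the repository URL, state file, and address are made up):

        #!/usr/bin/perl
        # Cron sketch: test newly committed revisions, mail on failure,
        # and mail again when a previously broken build goes green.
        use strict;
        use warnings;

        my $repo   = 'http://svn.example.com/myapp/trunk';  # made up
        my $state  = '/var/tmp/myapp-last-tested';
        my $notify = 'dev@example.com';

        my ($rev) = `svn info $repo` =~ /^Revision:\s+(\d+)/m
            or die "no revision from svn info\n";

        my ($last_rev, $last_ok) = (0, 1);
        if (open my $fh, '<', $state) {
            my $line = <$fh>;
            ($last_rev, $last_ok) = split ' ', $line if defined $line;
        }
        exit 0 if $rev == $last_rev;    # already tested this revision

        my $dir = "/tmp/myapp-test-$rev";
        system("svn export -q -r $rev $repo $dir") == 0
            or die "svn export failed\n";
        chdir $dir or die $!;
        my $ok = system("$^X Makefile.PL && make && make test >test.log 2>&1") == 0;

        mail("myapp r$rev: TESTS FAILED") if !$ok;  # the big fat email
        mail("myapp r$rev: tests fixed, thanks all") if $ok && !$last_ok;

        open my $out, '>', $state or die $!;
        print {$out} "$rev " . ($ok ? 1 : 0) . "\n";

        sub mail {
            my ($subject) = @_;
            open my $m, '|-', 'mail', '-s', $subject, $notify or return;
            print {$m} "Log: $dir/test.log\n";
            close $m;
        }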


    2) It would be nice to keep the application isolated in one directory
    so that I can keep several instances under the same account to do
    primitive staging.
    Agreed.

    I have a separate build process. This is a simple process and doesn't run
    any tests (because the build may happen on a "build server" without the
    dependencies needed by the application).

    This simply does an svn export, then it runs build/build_app.sh which then
    runs whatever scripts are needed to build that specific application. For
    example, minify and combine css and javascript, etc.

    Then a tarball is built of this export (named after the application and
    version) and made available on a web server.

    This can then be "pushed" to any server. The push process simply fetches
    the tarball from the build web server, unpacks it into its version-specific
    directory and runs the Makefile to test for dependencies on the target
    machine. If that passes a symlink is updated to point to this new version
    and the web server is restarted.

    The symlink makes it easy to revert to a previous version or to have
    multiple versions on the same machine for testing.
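
    The push step itself is little more than fetch, unpack, check, flip the
    symlink; roughly, as a sketch (the host, paths, and restart command are
    invented):

        #!/usr/bin/perl
        # Push sketch: fetch the built tarball, unpack it into a
        # version-specific directory, verify dependencies, then swap
        # the symlink and restart.
        use strict;
        use warnings;

        die "usage: push <app> <version>\n" unless @ARGV == 2;
        my ($app, $ver) = @ARGV;

        my $base = "/srv/$app";
        my $dir  = "$base/$app-$ver";

        sub run { system(@_) == 0 or die "@_ failed\n" }

        run('wget', '-q', "http://build.example.com/$app-$ver.tar.gz",
            '-O', "/tmp/$app-$ver.tar.gz");
        mkdir $dir or die "mkdir $dir: $!\n";
        run('tar', '-xzf', "/tmp/$app-$ver.tar.gz", '-C', $dir,
            '--strip-components=1');

        # Running Makefile.PL surfaces missing dependencies on this
        # particular target machine before anything goes live.
        chdir $dir or die $!;
        run($^X, 'Makefile.PL');

        # Build the new symlink beside the old one, then rename over
        # it; rename is atomic, so readers never see a missing link.
        unlink "$base/current.new";
        symlink $dir, "$base/current.new" or die "symlink: $!\n";
        rename "$base/current.new", "$base/current" or die "rename: $!\n";
        run('/etc/init.d/myapp', 'restart');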

    The applications have separate YAML files for different environments. There
    might be "dev.yml", "testing.yml", "qa.yml", and "production.yml". Each
    machine has a file in /etc/<$app_name> that sets what environment the
    application should start in (i.e. what YAML config file to use). Push to
    testing and the app starts and uses the testing database as configured in
    testing.yml.
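
    Reading that per-machine file at startup takes only a few lines; a sketch
    (the file locations and config keys are invented, and any YAML loader
    would do in place of YAML::XS):

        #!/usr/bin/perl
        # Sketch: load the YAML config named by the machine-wide
        # environment file.
        use strict;
        use warnings;
        use YAML::XS qw(LoadFile);

        open my $fh, '<', '/etc/myapp' or die "no environment file: $!\n";
        chomp(my $env = <$fh>);   # e.g. "testing" or "production"

        my $config = LoadFile("/srv/myapp/current/conf/$env.yml");
        printf "starting in %s, using DSN %s\n", $env, $config->{db}{dsn};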

    I've used this method for pushing directly to production, but in other cases
    use it for staging and then rsync to production servers from staging.


    I’ve read something about local::lib, but I’m still not sure about how
    to put things together. This has to be a common scenario, isn’t it? When
    you are finished updating the development version, what do you call to
    upload the update to the production server and what exactly happens
    along the way?
    Good question. Hopefully someone has a great solution.

    In the past I've used cfengine to make sure machines have the right
    dependencies.

    I've also used a local lib and rsynced that and messed with @INC and
    PERL5LIB, which doesn't make me happy. (Think about init.d scripts, cron
    jobs, etc.)
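
    One way to take PERL5LIB out of the picture for init.d scripts and cron
    jobs is to pin the path inside the scripts themselves; a sketch, assuming
    the modules were installed with local::lib into a made-up
    /srv/myapp/extlib:

        #!/usr/bin/perl
        # Top of an init.d script, cron job, or server script: load the
        # rsync'ed dependency tree explicitly rather than hoping that
        # PERL5LIB is set in whatever environment happens to run us.
        use strict;
        use warnings;
        use local::lib '/srv/myapp/extlib';  # sets up @INC for this process

        require MyApp;  # now resolved against extlib first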

    The next approach is to build packages (deb or rpm) of the application and
    dependencies and let the OS package manager handle it all. The goal there,
    besides making it easy for deployment, is to break out common code shared by
    apps into libraries.

    That said, there is something to be said for throwing everything in the
    application's lib directory, since you know when it's pushed you will have
    exactly the same code used in development. It's a lot easier to revert when
    everything is under a single symlink.

    I've heard of people that build an entire Perl installation and keep that
    separate from the OS installed Perl.


    --
    Bill Moseley
    moseley@hank.org
  • Oleg Kostyuk at Apr 5, 2010 at 9:31 am
    2010/3/30 Bill Moseley <moseley@hank.org>:
    ..........................
    The applications have separate YAML files for different environments. There
    might be "dev.yml", "testing.yml", "qa.yml", and "production.yml". Each
    machine has a file in /etc/<$app_name> that sets what environment the
    application should start in (i.e. what YAML config file to use). Push to
    testing and the app starts and uses the testing database as configured in
    testing.yml.
    Catalyst already has this capability: read the end of "DESCRIPTION" in
    Catalyst::Plugin::ConfigLoader, and see more details in the description of
    get_config_local_suffix().
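
    For example (a minimal sketch; the file names follow ConfigLoader's
    defaults for an application class called MyApp):

        package MyApp;
        use strict;
        use warnings;
        use Catalyst qw/ConfigLoader/;

        __PACKAGE__->setup;

        # ConfigLoader loads and merges, in order:
        #   myapp.conf          -- settings shared by all environments
        #   myapp_<suffix>.conf -- the local overlay; suffix defaults
        #                          to "local"
        #
        # Pick the overlay per machine through the environment, e.g.:
        #   MYAPP_CONFIG_LOCAL_SUFFIX=testing script/myapp_server.pl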

    --
    Sincerely yours,
    Oleg Kostyuk (CUB-UANIC)
  • Bill Moseley at Apr 5, 2010 at 2:19 pm

    On Mon, Apr 5, 2010 at 2:31 AM, Oleg Kostyuk wrote:

    2010/3/30 Bill Moseley <moseley@hank.org>:
    ..........................
    The applications have separate YAML files for different environments. There
    might be "dev.yml", "testing.yml", "qa.yml", and "production.yml". Each
    machine has a file in /etc/<$app_name> that sets what environment the
    application should start in (i.e. what YAML config file to use). Push to
    testing and the app starts and uses the testing database as configured in
    testing.yml.
    Catalyst already has this capability: read the end of "DESCRIPTION" in
    Catalyst::Plugin::ConfigLoader, and see more details in the description of
    get_config_local_suffix().
    Yes, similar. ConfigLoader didn't exist when I wrote the above, and mine is
    not a plugin (the Plugin is just a thin wrapper) -- it's available outside
    of Catalyst, which is very useful. Wasn't there talk a while back about
    splitting up ConfigLoader to make it available outside of Catalyst and with
    the ability to merge files?



    --
    Bill Moseley
    moseley@hank.org
  • Tomas Doran at Apr 7, 2010 at 12:37 am

    On 5 Apr 2010, at 15:19, Bill Moseley wrote:

    Wasn't there talk a while back about splitting up ConfigLoader to
    make it available outside of Catalyst and with the ability to merge
    files?
    Yes.

    Nobody has actually done anything useful to help make it happen yet,
    however :(

    Cheers
    t0m
  • Toby Corkindale at Apr 1, 2010 at 7:51 am

    On 30/03/10 19:32, Tomáš Znamenáček wrote:
    Hello!

    I have a Catalyst application that I would like to upload from the
    development box to the production server. Is there some kind of best
    practice to do that? My requirements:

    1) The process should take care of the dependencies and run the tests
    before installing. (Let's say the deps are declared in Makefile.PL
    or Build.PL.)
    2) It would be nice to keep the application isolated in one directory
    so that I can keep several instances under the same account to do
    primitive staging.

    Right now I am updating the application using Git. I push from the
    development box to a headless repository on the production server and
    there is a hook that updates the working copy. This fails requirement (1).

    I've read something about local::lib, but I'm still not sure about how
    to put things together. This has to be a common scenario, isn't it? When
    you are finished updating the development version, what do you call to
    upload the update to the production server and what exactly happens
    along the way?
    We package things up into Debian-style packages, and then upload those
    to a local repository of packages.
    Then servers can just be updated using the standard system tools (apt).
    This works quite well.

    You have a choice of either packaging up every single Perl dependency
    into a Debian package too (which is a world of pain), or installing all
    your dependencies into a local directory that you ship with the
    application. I recommend the latter. (You'll still need to include
    dependencies on things like the C libraries for your database client,
    etc. in the Debian control file, though.)
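
    With the bundled-directory approach, the scripts just need to put that
    directory on @INC first; a sketch, assuming the dependencies ship in an
    extlib/ beside lib/:

        # Near the top of script/myapp_server.pl and friends: prefer
        # the dependency tree shipped inside the package.
        use strict;
        use warnings;
        use FindBin;
        use lib "$FindBin::Bin/../extlib/lib/perl5",  # bundled CPAN deps
                "$FindBin::Bin/../lib";               # the application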
  • Bill Moseley at Apr 9, 2010 at 1:11 pm

    On Thu, Apr 1, 2010 at 12:51 AM, Toby Corkindale wrote:
    We package things up into Debian-style packages, and then upload those to a
    local repository of packages.
    Then servers can just be updated using the standard system tools (apt).
    Hi Toby,

    This is really the direction I'm heading now (although it's looking like
    CentOS and RPMs). Can you answer a few general questions?

    Are you using Template Toolkit? How (or really where) are the templates
    managed? Where do they get installed, how does the TT View know where to
    find them, etc? Do they end up in /usr/share/<app>/ for example?

    I'm sure you never have to roll back a release, but I also assume you are
    prepared to roll back if needed. How does that process work?

    What about your static content (css, js, images)? Where do those get
    installed?

    Any special tricks when using an app in "development" vs. production? (For
    example, under "dev" I use source css and js, but otherwise the app uses
    combined and compressed css and js.)


    You have a choice of either packaging up every single Perl dependency into
    a Debian package too (which is a world of pain), or installing all your
    dependencies into a local directory that you ship with the application. I
    recommend the latter. (You'll still need to include dependencies on things
    like the C libraries for your database client, etc. in the Debian
    control file, though.)

    We are doing a mix. But, for the most part we are creating single modules
    (packages). Mostly that was to encourage inclusion of unit tests and just
    more fine-grained management. But, it is more work, true.


    --
    Bill Moseley
    moseley@hank.org
  • Bogdan Lucaciu at Apr 9, 2010 at 7:10 pm

    On Fri, Apr 9, 2010 at 4:11 PM, Bill Moseley wrote:

    On Thu, Apr 1, 2010 at 12:51 AM, Toby Corkindale wrote:
    We package things up into Debian-style packages, and then upload those to
    a local repository of packages.
    Then servers can just be updated using the standard system tools (apt).
    Hi Toby,
    This is really the direction I'm heading now (although it's looking like
    CentOS and RPMs). Can you answer a few general questions?
    Are you using Template Toolkit? How (or really where) are the templates
    managed? Where do they get installed, how does the TT View know where to
    find them, etc? Do they end up in /usr/share/<app>/ for example?
    I'm sure you never have to roll back a release, but I also assume you are
    prepared to roll back if needed. How does that process work?
    What about your static content (css, js, images)? Where do those get
    installed?
    Considering a Catalyst app is laid out like any standard Perl
    distribution, using dh-make-perl will generate a pretty standard
    debian package, so all the stuff in script/ goes to /usr/bin/, and all
    the other files go in /usr/share/perl5/Dist/Name.

    To properly include the templates I would just use something like:

    <View::TT>
        INCLUDE_PATH = __path_to(root)__
        ...
    </View::TT>

    or similar.
    The static content lives in /usr/share/perl5/Dist/Name/root/static. If
    you use a caching reverse proxy (like Varnish) you can just let
    Static::Simple serve it; otherwise, just point your web server's
    /static location to that dir.

    Also, all the stuff produced by your application (uploaded files,
    whatever) should go in the correct paths (as per the FHS), like
    /var/lib/app-name/, /var/cache/app-name, etc.

    About rollback: it's as simple as installing the old version, and
    everything is replaced. But I don't know how you'd handle database schema
    changes (I use Schema::Versioned a lot; it can probably handle the
    rollback, but I haven't tried it so far).
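
    (For reference, wiring up DBIx::Class::Schema::Versioned looks roughly
    like this; the version number and upgrade directory are illustrative:)

        package MyApp::Schema;
        use strict;
        use warnings;
        use base 'DBIx::Class::Schema';

        our $VERSION = '0.002';   # bump on every schema change

        __PACKAGE__->load_namespaces;
        __PACKAGE__->load_components('Schema::Versioned');
        __PACKAGE__->upgrade_directory('/srv/myapp/sql/upgrades/');
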
    Any special tricks when using an app in "development" vs. production? (For
    example, under "dev" I use source css and js, but otherwise the app uses
    combined and compressed css and js.)
    Handle all this logic from config files and env variables.
    Take a look at http://www.catalystframework.org/calendar/2009/11

    I would argue for keeping everything that is NOT related to
    configuration/deployment but is INTERNAL to your application out of the
    external config files. For instance, all the View::TT config could very
    well be defined in TT.pm and kept in your code repo, while the Model::DB
    DSN should be in the config file.
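
    Concretely, the split might look like this (the keys shown are
    illustrative; the connect_info layout follows
    Catalyst::Model::DBIC::Schema):

        # lib/MyApp/View/TT.pm -- internal to the app, lives in the repo
        package MyApp::View::TT;
        use strict;
        use warnings;
        use base 'Catalyst::View::TT';

        __PACKAGE__->config(
            TEMPLATE_EXTENSION => '.tt',
            WRAPPER            => 'wrapper.tt',
        );

        1;

        # myapp.conf -- deployment-specific, shipped per environment:
        #   <Model::DB>
        #       connect_info  dbi:Pg:dbname=myapp;host=db1
        #       connect_info  myapp
        #       connect_info  sekrit
        #   </Model::DB>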
  • Peter Karman at Apr 10, 2010 at 2:36 am

    Bogdan Lucaciu wrote on 4/9/10 2:10 PM:
    To properly include the templates I would just use something like:
    <View::TT>
        INCLUDE_PATH = __path_to(root)__
        ...
    </View::TT>

    or similar.
    The static content lives in /usr/share/perl5/Dist/Name/root/static. If
    you use a caching reverse proxy (like Varnish) you can just let
    Static::Simple serve it; otherwise, just point your web server's
    /static location to that dir.
    See also Catalyst::Plugin::Static::Simple::ByClass

    --
    Peter Karman . http://peknet.com/ . peter@peknet.com
  • Toby Corkindale at Apr 13, 2010 at 5:40 am

    On 09/04/10 23:11, Bill Moseley wrote:
    On Thu, Apr 1, 2010 at 12:51 AM, Toby Corkindale wrote:

    We package things up into Debian-style packages, and then upload
    those to a local repository of packages.
    Then servers can just be updated using the standard system tools (apt).

    Hi Toby,

    This is really the direction I'm heading now (although it's looking like
    CentOS and RPMs). Can you answer a few general questions?

    Are you using Template Toolkit? How (or really where) are the templates
    managed? Where do they get installed, how does the TT View know where
    to find them, etc? Do they end up in /usr/share/<app>/ for example?
    Yes, I'm using Template Toolkit, although due to the
    apparently-unfixable crashes in the XS stash, I've also built some
    packages with Template::Alloy too.

    I just put my templates into the 'root' directory, as per the Catalyst
    standard layout. After installation, they end up under your distro's
    Perl directory, in site_perl or vendor_perl, under a 'root' directory in
    your Module's namespace.

    E.g. if you have MyApp.pm, then your templates end up in
    ..../site_perl/5.10.1/MyApp/root/
    I'm sure you never have to roll-back a release, but I also assume you
    are prepared to roll-back if needed. How does that process work?
    If you're using the Debian tools, then you can specify a version number
    when giving a package to "upgrade", which can also be used to downgrade.
    (This requires you to configure your company's local .deb package
    repository to hang on to N old versions; the value of N is up to you.)

    The Debian tools seem really quite good at noticing when you've, say, made
    changes to the local configuration file for your app but there are
    also changes to it coming down in the new version, and they'll prompt you
    about this.

    It's worth noting that by default, the Debian package tools will put
    your myapp.conf into site_perl/5.10.1/MyApp/ as well. I dislike this,
    and so override the debian/rules file to move it into /etc/, where it
    makes more sense.
    What about your static content (css, js, images)? Where do those get
    installed?
    As above, under site_perl; however, you can override this in the
    debian/rules file to put it in /var/www/ or somesuch. I'm lazy and tend
    to just use Static::Simple; if you have a reverse proxy in front of your
    app (as you should if performance is a concern) then you can just cache
    the static stuff there instead.

    Any special tricks when using an app in "development" vs. production?
    (For example, under "dev" I use source css and js, but otherwise the app
    uses combined and compressed css and js.)
    When in development, I run it on a different server altogether, and do
    not have it installed into the global perl path at all. And I run it
    with "myapp/script/myapp_server.pl" rather than via a standalone
    webserver+appserver(+ optional proxy) stack.

    For your example, I would put the command to combine-and-compress the
    CSS and JS into the debian/rules file.
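
    That command can itself be a few lines of Perl, e.g. with
    JavaScript::Minifier from CPAN (the file locations are made up; CSS can
    be handled the same way with CSS::Minifier):

        #!/usr/bin/perl
        # Build-time sketch: concatenate the source JS and minify the
        # result into the single file served in production.
        use strict;
        use warnings;
        use JavaScript::Minifier qw(minify);

        my $combined = '';
        for my $file (sort glob 'root/static/js/src/*.js') {
            open my $in, '<', $file or die "$file: $!\n";
            local $/;                  # slurp the whole file
            $combined .= <$in> . "\n";
        }

        open my $out, '>', 'root/static/js/myapp.min.js' or die $!;
        print {$out} minify(input => $combined);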

    However, you need a staging server which mirrors the production
    environment and stack in order to properly test it prior to release.

    You have a choice of either packaging up every single Perl
    dependency into a Debian package too (which is a world of pain), or
    installing all your dependencies into a local directory that you
    ship with the application. I recommend the latter. (You'll still
    need to include dependencies on things like the C libraries for your
    database client, etc. in the Debian control file, though.)


    We are doing a mix. But, for the most part we are creating single
    modules (packages). Mostly that was to encourage inclusion of unit
    tests and just more fine-grained management. But, it is more work, true.

    I disliked having to use the relatively primitive and time-consuming
    Debian/Gentoo/RedHat tools to manage CPAN modules, when CPANPLUS exists.
    Why use a plastic trowel when you have a pneumatic digger available? :)

    I should point out that this does then require keeping the entire
    installed Perl tree in source control, though, so that one can tag
    exactly which modules were used by (and bundled with) an application.


    Toby
