coapp-developers team mailing list archive

Re: How exactly will CoApp come together?

 

>> Is there a reason to collect all the data using this scantool other than to make a build file?
Yes. Knowledge gained in the scan/trace pass is also used to assist in packaging.

>> Surely it would be less work to just manually set up VS project files, CMake, etc. than it would be to build a tool that auto-generates them?
Oh, Hell no.
I’ve already done this for over a hundred packages with my gen1 tools. It’s absolutely not easier to manually set up build files.

>> I admit I'm a big fan of CMake and I know CPack exists that makes packages out of your programs.
Yes, it makes “packages”. It might even be able to make packages that conform to CoApp’s narrow specification. That doesn’t necessarily make it faster to do.

>> What I am wondering is if you could save a lot of development time by using CMake?
I can’t see how CMake *shortens* development time. It has other benefits, to be sure, but it’s not going to accelerate the rest of the process. The CMake scripts would still have to be created, and would hopefully generate project files to the same spec as I’ve been doing. Possible? Yes. Faster? I don’t think so.

>> You would not need to make the ScanTool and mkSpec programs. If you contributed to CPack then you could make it generate MSI installers and that would mean you wouldn't need to make a mkPackage program either.
If you think making CMake scripts for a lot of projects by hand is simple and fast, I’d recommend you try. Start with the PHP stack. Rebuild the current PHP for Windows build in CMake. If you finish OpenSSL before I build the gen2 tools and get done, we’ll talk.
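
Just to give a sense of the hand-work involved: even the trivial end of it (a single zlib-style DLL) means hand-maintaining something like the following. This is purely illustrative, and a real tree like PHP means dozens of targets, each with its own file lists, defines and install rules:

    cmake_minimum_required(VERSION 2.8)
    project(zlib C)

    # every source file, define and install rule below is hand-maintained
    add_library(zlib SHARED adler32.c compress.c crc32.c deflate.c
                inflate.c inffast.c inftrees.c trees.c uncompr.c zutil.c)
    set_target_properties(zlib PROPERTIES DEFINE_SYMBOL ZLIB_DLL)

    install(TARGETS zlib
            RUNTIME DESTINATION bin
            LIBRARY DESTINATION lib
            ARCHIVE DESTINATION lib)

    # CPack today gives you NSIS/zip-style installers; an MSI generator
    # that meets the CoApp spec would still have to be written
    set(CPACK_GENERATOR "NSIS")
    include(CPack)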

>> I'd hate to see you put in a lot of development time when the tools you need already exist, or perhaps I'm not understanding the whole picture of package building correctly?
I wouldn’t necessarily say you don’t understand the whole picture, but I’ve spent the last several years working on this specific problem, and the last 20 years making build systems deliver the best results. It’s going to take a few weeks of explaining before the entire model comes across.

Don’t get me wrong, CMake has its benefits; it’s just not going to accelerate this solution. It’s more than likely that once the pieces are in place, CMake can be fully supported, but it’s not a concrete requirement in the short run.


Garrett Serack | Open Source Software Developer | Microsoft Corporation
I don't make the software you use; I make the software you use better on Windows.

From: coapp-developers-bounces+garretts=microsoft.com@xxxxxxxxxxxxxxxxxxx [mailto:coapp-developers-bounces+garretts=microsoft.com@xxxxxxxxxxxxxxxxxxx] On Behalf Of Andrew Fenn
Sent: Monday, April 12, 2010 9:48 PM
To: coapp-devel
Subject: Re: [Coapp-developers] How exactly will CoApp come together?

Is there a reason to collect all the data using this scantool other than to make a build file? Surely it would be less work to just manually set up VS project files, CMake, etc. than it would be to build a tool that auto-generates them? I admit I'm a big fan of CMake and I know CPack exists that makes packages out of your programs.

What I am wondering is if you could save a lot of development time by using CMake? You would not need to make the ScanTool and mkSpec programs. If you contributed to CPack then you could make it generate MSI installers and that would mean you wouldn't need to make a mkPackage program either.

I'd hate to see you put in a lot of development time when the tools you need already exist, or perhaps I'm not understanding the whole picture of package building correctly?

On Tue, Apr 13, 2010 at 1:23 AM, Garrett Serack <garretts@xxxxxxxxxxxxx> wrote:
So,

I’ve been taking questions as to how CoApp packages get built.

Lemme see if I can sketch out the vision for you, so that you get an idea of where it’s going. This isn’t set in stone, but I’ve actually validated this is a workable solution.

Let’s say I want to create a library package for zlib.

First, I’m going to import the zlib source code into a Bazaar branch in a new CoApp sub-project on Launchpad.

Checking out from there, I’ll first see if the project can be compiled at all using MSVC (any version). If it has an older project file, I’ll load it up in Visual Studio 2010, let it upgrade the project, and save it.

Drop back to the command line.

SCANTOOL can be pointed at the source directory to scan through all the source files and build files and generate some intelligence about the project as a whole. It gets a list of all the source files (C, C++, .H, etc.), the potential conditional defines present in the source (#define FOO …), and identifies what additional files are present in the project, for which we’ll have to determine what to do (delete them, include them in the final package as resources, or something else). SCANTOOL dumps all of this data into an XML intelligence file for the project.
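
To make that concrete, the intelligence file would look something along these lines (element names are purely illustrative; the gen2 formats aren’t nailed down yet):

    <project name="zlib">
      <sources>
        <file path="deflate.c" language="C" />
        <file path="zlib.h"   language="C-header" />
      </sources>
      <defines>
        <define name="ZLIB_DLL" />
        <define name="NO_GZIP" />
      </defines>
      <other-files>
        <!-- files we still have to decide what to do with -->
        <file path="README" disposition="undecided" />
      </other-files>
    </project>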

Build the project (by the makefile, the vcxproj file, or whatever means necessary). When doing so, however, use the TRACE tool to watch the library get built. TRACE creates an XML file recording every file read, write and delete, and every command line for the build process and all of its child processes.
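
Again purely as illustration (the real format is TBD), the trace data would be along the lines of:

    <trace>
      <process cmdline="cl.exe /c /DZLIB_DLL deflate.c">
        <read  path="C:\src\zlib\deflate.c" />
        <read  path="C:\src\zlib\deflate.h" />
        <write path="C:\src\zlib\deflate.obj" />
      </process>
      <process cmdline="link.exe /DLL /OUT:zlib1.dll deflate.obj inflate.obj">
        <read  path="C:\src\zlib\deflate.obj" />
        <write path="C:\src\zlib\zlib1.dll" />
      </process>
    </trace>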

At this point the developer can also create a hand-made intelligence file for things that are known about the project (what targets are desired, etc.).

The intelligence files and the trace data are fed into another tool, MKSPEC, which creates a set of .spec files, each of which describes a binary output desired from the project (a .LIB, .DLL, .EXE, etc.) and lists the files needed, the conditional #defines, and other options. (This is essentially a compiler-neutral way of representing what is needed to build a particular output.)
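
A .spec file for the zlib DLL might end up looking something like this (names and shape are illustrative, not final):

    <spec output="zlib1.dll" type="dll">
      <sources>
        <file path="adler32.c" />
        <file path="deflate.c" />
        <file path="inflate.c" />
      </sources>
      <defines>
        <define name="ZLIB_DLL" />
      </defines>
      <options>
        <!-- compiler-neutral options; MKPROJECT maps these to each toolchain -->
        <option name="runtime" value="dynamic" />
      </options>
    </spec>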

Each .spec file is then fed into MKPROJECT, which will generate a VC10 project file. Plugins for MKPROJECT can trivially build other types of project files: VC9 projects, makefiles for MinGW, or CMake files for the CMake faithful. MKPROJECT also ties a collection of project files together into a .SLN file for Visual Studio. Outputs are normalized for naming conventions.

The .SLN file is fed into Visual Studio (or MSBuild, the command-line tool), which compiles the binaries. (I’ve got a plan for PGO [profile-guided optimization] as well, but I’m going to ignore that right now.)
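
For the command-line route, that boils down to something like the following (solution name and configuration are illustrative):

    msbuild zlib.sln /p:Configuration=Release /p:Platform=Win32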

The binaries are fed into a tool called SMARTMANIFEST, which creates .manifest and policy files for the library and binds them to any .DLLs and .EXEs created.
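
For the curious, the embedded manifest would be along these lines; the assembly name, version and publicKeyToken here are placeholders (the real ones come from the versioning and signing work), and the policy files add the bindingRedirect entries so older clients pick up a newer library:

    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <assemblyIdentity type="win32" name="coapp.zlib" version="1.2.5.0"
                        processorArchitecture="x86"
                        publicKeyToken="0000000000000000" />
      <file name="zlib1.dll" />
    </assembly>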

The binaries (and manifest data), along with the project source code and build files, are fed into MKPACKAGE, which uses WiX to build an MSI for each binary, along with a source MSI containing just the files necessary to rebuild the binaries (source, .vcxproj, .sln).
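
The .wxs that MKPACKAGE hands to WiX would boil down to something like this per binary (GUIDs, names and paths are illustrative):

    <Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
      <Product Id="*" Name="zlib" Version="1.2.5.0" Language="1033"
               Manufacturer="CoApp"
               UpgradeCode="00000000-0000-0000-0000-000000000000">
        <Package InstallerVersion="200" Compressed="yes" />
        <Media Id="1" Cabinet="zlib.cab" EmbedCab="yes" />
        <Directory Id="TARGETDIR" Name="SourceDir">
          <Directory Id="ProgramFilesFolder">
            <Directory Id="INSTALLDIR" Name="zlib">
              <Component Id="ZlibDll" Guid="*">
                <File Source="zlib1.dll" KeyPath="yes" />
              </Component>
            </Directory>
          </Directory>
        </Directory>
        <Feature Id="Main" Level="1">
          <ComponentRef Id="ZlibDll" />
        </Feature>
      </Product>
    </Wix>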

At that point the developer can identify what files can be trimmed from the source tree, and the whole thing can be updated in Bazaar.

http://twitpic.com/rqmo5 -- a flowchart of what I just described. Well, without TRACE.

(there’s a lot more detail to be found, but that’s the gist of it)



Garrett Serack | Microsoft's Open Source Software Developer | Microsoft Corporation
Office: (425) 706-7939    email/messenger: garretts@xxxxxxxxxxxxx
blog: http://fearthecowboy.com    twitter: @fearthecowboy

I don't make the software you use; I make the software you use better on Windows.







_______________________________________________
Mailing list: https://launchpad.net/~coapp-developers
Post to     : coapp-developers@xxxxxxxxxxxxxxxxxxx
Unsubscribe : https://launchpad.net/~coapp-developers
More help   : https://help.launchpad.net/ListHelp
