Greetings / Project Goals - Minix


  1. Re: Greetings / Project Goals

    On Aug 29, 10:48 pm, João Jerónimo
    wrote:
    > Daniel Carrera wrote:
    > > packman install foo # Same as "apt-get install"
    > > packman remove foo # Same as "apt-get remove"
    > > packman build foo # Wrapper around apt-build
    > > packman search foo # Wrapper around apt-cache

    >
    > packman fetch_patch
    >
    > Which runs apt-build in such a way that the package is downloaded and
    > patched to correctly compile for Minix, but lets the user manually issue
    > the commands to compile it.


    "apt-get source" will download the sources for a package to the
    current directory. Apt-build is just a perl script that wraps around
    apt-get. You can get the source here:

    http://packages.debian.org/sid/apt-build

    apt-build is quite comprehensible. If you look at the install()
    procedure you'll see how it works. It would be easy to add a fetch()
    procedure that downloads the source and installs all the build
    dependencies but does not actually install: Just make a copy of the
    install() subroutine and remove the last if statement.
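    For illustration: with stock apt, the effect of that hypothetical fetch()
    can already be approximated from the command line. The sketch below
    defines a `fetch_only` helper (my name, not part of apt-build) that only
    prints the apt commands it would run, as a dry run:

```shell
# Hypothetical "fetch" step: prepare a package for a manual build.
# It echoes the apt commands instead of executing them, so this is a
# dry-run sketch; remove the echoes to run it on a Debian-style system.
fetch_only() {
  pkg="$1"
  echo "apt-get build-dep -y $pkg"   # install the build dependencies
  echo "apt-get source $pkg"         # download and unpack patched source
}

fetch_only figlet
```

    In apt-build proper, this corresponds to copying install() and stopping
    before the final installation step, as described above.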


    > And about patches: I don't know how Debian does so, but if we published the
    > patches instead of the modified packages, and had the package manager apply
    > them for us (I guess apt-build can be modified to support this, if it
    > doesn't already support), we could save much disk space at the server side.


    Debian keeps the actual packages. If you run "apt-get source" you get
    three files:

    foo.orig.tar.gz # Original upstream tarball.
    foo.diff.gz # Debian's patches.
    foo.dsc # Package manager metadata.


    In any event, I'm not sure that your idea would work well. Sure,
    it should be easy to modify apt-build to download and apply an extra
    patch from minix3.org. But I see a few problems:

    1) We are forced to keep the same set of dependencies as a package
    built for a different system. Those might not always work.

    2) The package might recommend a package that is not yet ported to
    Minix.

    3) We still have to keep a server with the binary packages. So we
    wouldn't be saving all that much.


    If we are concerned about the server workload, and the Minix team
    decides that they do like the Debian system, I can suggest another
    idea: We could ask Debian to let us make a Debian/Minix project.
    Debian has a lot of experience making ports to other kernels. There is
    a Debian/BSD, Debian/Hurd and even a Debian/Win32 project. If Debian
    accepts the project, they would give us all the server space we need.

    A consequence of making a Debian project is that we'd no longer have
    the freedom to fork apt-build to make it do things differently. But we
    could try asking the apt-build team for the changes we want.

    On the other hand, maybe Debian won't mind if the Debian/Minix port
    also has a script called packman that wraps around apt and does its
    own thing.
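    As a sketch of how thin that wrapper could be (the subcommand names come
    from the earlier post; the dispatcher prints the underlying command
    instead of executing it, so none of this is the real tool):

```shell
# Minimal "packman" front end: map each subcommand onto the apt tool
# it wraps. Prints the resulting command rather than running it.
packman() {
  cmd="$1"; pkg="$2"
  case "$cmd" in
    install) echo "apt-get install $pkg" ;;
    remove)  echo "apt-get remove $pkg" ;;
    build)   echo "apt-build install $pkg" ;;
    search)  echo "apt-cache search $pkg" ;;
    *) echo "usage: packman {install|remove|build|search} <pkg>" >&2
       return 1 ;;
  esac
}

packman install foo    # prints: apt-get install foo
```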

    Just some thoughts.

    Daniel.

  2. Re: Greetings / Project Goals

    On Aug 29, 10:36 pm, João Jerónimo
    wrote:
    > This is not even half as flexible as USE flags, which are applied by
    > the .ebuild and may include anything (like ./configure options, extra
    > patches, make arguments, extra dependencies, etc).


    After much searching and poking through source code, I found that
    apt-build has a config file (apt-build.conf) that includes some of the
    things you mention:

    Olevel = -O3
    march = -march=i686
    mcpu = -mcpu=athlon-xp
    options = " "
    make_options = " "

    http://ubuntuforums.org/showthread.php?t=759481

    But I have not found any good documentation for apt-build.conf, so I
    don't know what's possible. I'm sure it does not support "use kde".

    Daniel.

  3. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > That's quite a lot of power packed inside USE flags. I just scanned
    > the Gentoo doc so I have an idea of what we're talking about. I think
    > we can emulate some but not all of the features you say.
    > we could have a config file with the options:
    > (...)
    > You get the idea.


    USE flags are an abstraction over compile-time options. For example, if I
    unset the "doc" USE flag (I think it's set by default), the documentation
    will be omitted during package installation. The appropriate flag will be
    passed to the configure script (be it --with-no-docs, --no-docs,
    --omit-docs or --throw-docs-away) or to whichever build system it uses.

    What I meant when I mentioned dependencies was: suppose you compile GNU
    Emacs with the "X" USE flag set. Then, Xorg will be a dependency of Emacs.
    But if you don't want the X interface for Emacs, you unset "X" (you can do
    this package-wise or system-wide, as you like) and Emacs no longer depends
    on Xorg being installed.
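    On Gentoo, that per-package switch is a one-line entry in a plain
    configuration file; for example (real Portage syntax, with Emacs being
    the package from the paragraph above):

```
# /etc/portage/package.use -- per-package USE flag overrides
app-editors/emacs -X    # build Emacs without the X interface
```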

    They can do this because the ebuilds are (kind of) scripts. These scripts
    declare some functions (src_unpack, src_compile, src_install, etc.) which
    do the real job.

    > But I don't see how we can add extra dependencies or patches or "use
    > kde" without extending the .deb format. And I don't want to be
    > extending any format, even if the format allows extension.


    I understand, but patching sources is pretty obvious... I mean, only
    someone who doesn't care at all about wasting tons of disk space
    duplicating data would forget to take advantage of patches. :-) And don't
    forget that source code size generally grows asymptotically faster than
    binary size (at least, I believe so; I've never tested it)...

    The .ebuilds get the packages from the official servers of the corresponding
    package and apply any needed patches locally. The patches are supplied with
    the portage tree itself, and are updated when the user synchronizes the
    tree.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  4. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > On Aug 28, 1:10 am, Rui Maciel wrote:
    >> Nonetheless, the process is a pain in the neck when compared with the
    >> regular sudo apt-get install.

    >
    > Sorry to reply again to the same post. Are you sure apt-build is a
    > pain? I've never used it, so I looked it up and it looks simple
    > enough:
    >
    > http://polishlinux.org/linux/debian/...timize-debian/


    "Sorry, but apt-build cannot manage USE flags."

    > On the topic of USE flags, one of the posters suggests editing /etc/
    > make.conf instead.


    This is not even half as flexible as USE flags, which are applied by
    the .ebuild and may include anything (like ./configure options, extra
    patches, make arguments, extra dependencies, etc).

    Quote from the article:
    > Anyway, don’t worry. Debian wasn’t created to be
    > compiled by users, contrary to Gentoo.


    Each distribution has its own target. I only hoped Gentoo had better
    support for binaries. I don't always believe in all-in-one solutions, but
    with careful engineering it's possible to achieve some of the benefits
    of "the other world" in order to get an overall better system...

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  5. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > Then we can probably find a way to get an equivalent to Gentoo's "USE"
    > if that's what people want.


    I would like that, if something like it is proven to be possible, even if
    not *as* flexible as Gentoo's.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  6. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > On Aug 28, 11:35 am, Daniel Carrera wrote:
    >> packman install foo  # Installs foo as a binary.
    >> packman install foo --source # Uses apt-build to compile sources.

    >
    > Here is a better idea: Replace "packman install --source" with "packman
    > build". So the new packman would look like this:
    >
    > packman install foo # Same as "apt-get install"
    > packman remove foo # Same as "apt-get remove"
    > packman build foo # Wrapper around apt-build
    > packman search foo # Wrapper around apt-cache


    packman fetch_patch

    Which runs apt-build in such a way that the package is downloaded and
    patched to correctly compile for Minix, but lets the user manually issue
    the commands to compile it.

    And about patches: I don't know how Debian does this, but if we published
    the patches instead of the modified packages, and had the package manager
    apply them for us (I guess apt-build can be modified to support this, if
    it doesn't already), we could save much disk space on the server side.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  7. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > 1) We are forced to keep the same set of dependencies as a package
    > built for a different system. Those might not always work.
    >
    > 2) The package might recommend a package that is not yet ported to
    > Minix.


    I did not understand you.
    Packages need to be ported to work on Minix. This means that, for building
    from source, either:
    1 - apt-build downloads the original package and applies some patches
    containing the modifications needed to have the package compile correctly
    on Minix
    2 - or apt-build downloads a pre-modified package, ready to be compiled on
    Minix

    Either approach works, and both require the same set of dependencies for
    a given package. However, approach 2 takes much more disk space than
    approach 1.

    > 3) We still have to keep a server with the binary packages. So we
    > wouldn't be saving all that much.


    I thought binary packages tended to take much less space than source
    packages, but I just compiled Python on my machine and found I was
    wrong. :-)
    But perhaps Python was not the best example, because even a freshly
    compiled version comes with a lot of source code which is intended to be
    interpreted.
    As a matter of fact, I've just compiled GCC 4.2.2, and the binary result
    takes 7 MB less than the original package's size (after being
    .tar.bz2'ed). The original source was 19 MB and the binary is 12 MB.

    Anyway, that way we would spend space only on the binaries, but this time
    it would not matter much, because disk space is currently quite
    inexpensive. It would only pay off if the gains were big (like sources
    taking twice the space the binaries took).

    > If we are concerned about the server workload, and the Minix team
    > decides that they do like the Debian system, I can suggest another
    > idea: We could ask Debian to let us make a Debian/Minix project.
    > Debian has a lot of experience making ports to other kernels. There is
    > a Debian/BSD, Debian/Hurd


    It's a possible solution.

    > and even a Debian/Win32 project.


    ????
    Afaik, it's only a Debian installer for Windows. This can be done with a
    partition-editing library, an ext3 IFS driver and some black magic;
    there's nothing special about it.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  8. Re: Greetings / Project Goals

    On Aug 29, 10:53 pm, João Jerónimo
    wrote:
    > Daniel Carrera wrote:
    >
    > > 1) We are forced to keep the same set of dependencies as a package
    > > built for a different system. Those might not always work.

    >
    > > 2) The package might recommend a package that is not yet ported to
    > > Minix.

    >
    > I did not understand you.
    > Packages need to be ported to work on Minix. This means that, for building
    > from source, either:
    > 1 - apt-build downloads the original package and applies some patches
    > containing the modifications needed to have the package compile correctly
    > on Minix
    > 2 - or apt-build downloads a pre-modified package, ready to be compiled on
    > Minix



    Before doing apt-build also downloads dependencies and build
    dependencies. For example, it might try to install glibc. That's why I
    don't think we can just grab the Linux packages from Debian and patch
    them. We do have to make our own packages so we get the dependencies
    right.

    Note: I am not an expert on apt or dpkg. I've made a couple of very
    simple packages and that's it. Anything I say could be wrong.


    > > 3) We still have to keep a server with the binary packages. So we
    > > wouldn't be saving all that much.

    >
    > I thought binary packages tended to spend much less space than source
    > packages,


    True enough. And I just thought of another idea: Maybe we can get a
    set of donated mirrors like Debian has. Maybe through sourceforge for
    example.

    > > and even a Debian/Win32 project.

    >
    > ????
    > Afaik, it's only a Debian installer for windows. This can be done with a
    > partition editing library, an ext3 IFS driver and some black magic; there's
    > nothing special about it.


    Ok. I just saw the mailing list name.

    Daniel.

  9. Re: Greetings / Project Goals

    On Aug 29, 11:34 pm, Daniel Carrera wrote:
    > Before doing apt-build also downloads dependencies and build
    > dependencies.


    That line was supposed to say:

    Before doing (1)/(2) apt-build also downloads dependencies...


    Before compiling the patched sources apt-build also installs all the
    dependencies needed to build the package. Then it compiles the
    sources, makes a new .deb with the just-compiled binaries and gives
    that .deb to dpkg for installing.

    Some of the Linux packages will depend on glibc. The Minix version of
    those packages should depend on Minix's C library instead. If you want
    the package manager to know that you installed the program you have to
    make a .deb. That .deb needs to have the right dependencies for Minix.
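    To make that concrete: the dependency lives in the .deb's control file.
    The field names below are real dpkg syntax, but the package name, the
    version and the "minix-libc" dependency are made-up examples:

```
Package: foo
Version: 1.0-1minix1
Architecture: i386
Depends: minix-libc (>= 3.1)
Description: example control stanza for a Minix rebuild of foo
```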

    Daniel.

  10. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    >> I did not understand you.
    >> Packages need to be ported to work on Minix. This means that, for
    >> building from source, either:
    >> 1 - apt-build downloads the original package and applies some patches
    >> containing the modifications needed to have the package compile correctly
    >> on Minix
    >> 2 - or apt-build downloads a pre-modified package, ready to be compiled
    >> on Minix

    >
    > Before doing apt-build also downloads dependencies and build
    > dependencies. For example, it might try to install glibc. That's why I
    > don't think we can just grab the Linux packages from Debian and patch
    > them. We do have to make our own packages so we get the dependencies
    > right.


    Yes. My proposal was to download the sources from the official sites (for
    example, from ftp.mozilla.org), extract them, apply the patches that adapt
    the original released sources so that they compile on Minix, and finally
    compile+install them.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  11. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > Some of the Linux packages will depend on glibc. The Minix version of
    > those packages should depend on Minix's C library instead. If you want
    > the package manager to know that you installed the program you have to
    > make a .deb. That .deb needs to have the right dependencies for Minix.


    Sorry, but apt-build downloads the sources as a .deb? That doesn't make
    much sense...

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  12. Re: Greetings / Project Goals

    On Aug 31, 1:22 am, João Jerónimo
    wrote:
    > Daniel Carrera wrote:
    >
    > > Some of the Linux packages will depend on glibc. The Minix version of
    > > those packages should depend on Minix's C library instead. If you want
    > > the package manager to know that you installed the program you have to
    > > make a .deb. That .deb needs to have the right dependencies for Minix.

    >
    > Sorry, but apt-build downloads the sources as a .deb? That doesn't make
    > much sense...


    That is not what I meant to say. I do not know in which format it
    downloads them, but I assume it is .tar.gz. What I meant is that
    apt-build performs the following actions:

    1. Install dependencies.
    2. Download sources and apply Debian's patch.
    3. Compile sources.
    4. Create a .deb locally out of the binaries you just compiled.
    5. Give the .deb to dpkg.

    The reason for doing (4) and (5) instead of running "make install" is
    that this way the package is registered with the package manager, with
    all the benefits that implies.
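    The same five steps can be traced by hand with standard Debian tools. In
    the sketch below, `run` is a helper I've added so the sequence prints as
    a dry run instead of executing, and "foo" is a stand-in package name:

```shell
# Dry-run sketch of apt-build's pipeline using standard Debian tools.
# "run" just prints each command; drop it to execute for real.
run() { printf '%s\n' "$*"; }

run apt-get build-dep foo        # 1. install build dependencies
run apt-get source foo           # 2. fetch source + apply Debian's diff
run cd foo-1.0                   #    (the unpacked source directory)
run dpkg-buildpackage -b -uc     # 3+4. compile and create ../foo_*.deb
run dpkg -i ../foo_1.0_i386.deb  # 5. hand the .deb to dpkg
```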

    Daniel.

  13. Re: Greetings / Project Goals

    On Aug 31, 1:21 am, João Jerónimo
    wrote:
    > Daniel Carrera wrote:
    > > Before doing apt-build also downloads dependencies and build
    > > dependencies. For example, it might try to install glibc. That's why I
    > > don't think we can just grab the Linux packages from Debian and patch
    > > them. We do have to make our own packages so we get the dependencies
    > > right.

    >
    > Yes. My proposal was to download the sources from the official sites (for
    > example, from ftp.mozilla.org), extract, apply the patches to adapt the
    > original released sources so that they compile on Minix, and finally
    > compile+install them.


    apt-build does not download the sources from (for example)
    ftp.mozilla.org, it gets them from (for example) ftp.debian.org. If
    you think about it, with a few thousand packages in the system, you
    can't take the risk of some of them changing their website or moving
    their files and breaking all your users.

    I understand the theory of not wanting to keep full sources for
    Firefox on minix3.org, but:
    (1) I don't think getting sources from each project separately will
    work well, and
    (2) this is not the way APT works.

    There are ways we can get free server space. SourceForge will gladly
    accept a Minix 3 repository project, and then we'd have lots of
    mirrors and the like. This is what Debian does (I mean the mirrors; I
    don't know if they use SF). I think it's a lot better to get an
    account with SourceForge than it is to fork APT.

    Daniel.

  14. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > 4. Create a .deb locally out of the binaries you just compiled.
    > 5. Give the .deb to dpkg.
    >
    > The reason for doing (4) and (5) instead of running "make install" is
    > that this way the package is registered with the package manager, with
    > all the benefits that implies.


    Ok.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  15. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > On Aug 31, 1:21 am, João Jerónimo
    > wrote:
    >> Daniel Carrera wrote:
    >> > Before doing apt-build also downloads dependencies and build
    >> > dependencies. For example, it might try to install glibc. That's why I
    >> > don't think we can just grab the Linux packages from Debian and patch
    >> > them. We do have to make our own packages so we get the dependencies
    >> > right.

    >>
    >> Yes. My proposal was to download the sources from the official sites (for
    >> example, from ftp.mozilla.org), extract, apply the patches to adapt the
    >> original released sources so that they compile on Minix, and finally
    >> compile+install them.

    >
    > apt-build does not download the sources from (for example)
    > ftp.mozilla.org, it gets them from (for example) ftp.debian.org. If
    > you think about it, with a few thousand packages in the system, you
    > can't take the risk of some of them changing their website or moving
    > their files and breaking all your users.


    This apparent problem can be solved by providing regular updates for
    the "package tree". Gentoo does this using rsync.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  16. Re: Greetings / Project Goals

    On Aug 31, 1:50 am, João Jerónimo
    wrote:
    > This apparent problem can be solved by providing regular updates for
    > the "package tree". Gentoo does this using rsync.


    So users have to run some sort of update program every once in a while,
    and it updates the URLs for all the packages? What happens if a package
    is removed entirely?

    I can't guess how easy or hard it might be to emulate that feature
    using APT. But as I said earlier, we could just get a mirror from
    SourceForge if that's the issue. I suspect that Gentoo uses
    ftp.mozilla.org because Gentoo users usually want to have the latest
    bleeding-edge version of every software package.

    Daniel.

  17. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    > On Aug 31, 1:50 am, João Jerónimo
    > wrote:
    >> This apparent problem can be solved by providing regular updates for
    >> the "package tree". Gentoo does this using rsync.

    >
    > So users have to run some sort of update program every once in a while
    > and it updates the URLs for all the packages?


    Only if they actually change, of course...

    > What happens if a package is removed entirely?


    Old packages usually remain on the ftp servers for a long time, and for
    very old packages the user had better update to a newer version anyway.

    Actually, fetching problems account for only about 5% of total
    installation failures in Gentoo (unless your internet connection is
    down :-) ).

    > I can't guess how easy or hard it might be to emulate that feature
    > using APT. But as I said earlier, we could just get a mirror from
    > SourceForge if that's the issue. I suspect that Gentoo uses
    > ftp.mozilla.org because Gentoo users usually want to have the latest
    > bleeding-edge version of every software package.


    That's wrong. Portage only installs unstable and/or testing apps if the
    user asks for them. Personally, I usually don't install them. The only
    exception is if I need or want to test some unstable feature or software
    (for example, KDE 4).
    Packages that are still being tested are masked so that they won't
    install on architecture(s) where it's known or expected that they won't
    work. Additionally, packages that are expected not to compile at all
    because they have serious problems can be "hard masked".

    Of course, the user can revert all this, but only if (s)he wants. For
    example, I can tell Portage that it's OK to install packages that are
    unstable on x86, or tell Portage to update to unstable versions of gtk+.
    I can revert the hard-masking of a package if I want, too.
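    For reference, that per-package override is also just a one-line entry in
    a plain file (real Portage syntax):

```
# /etc/portage/package.keywords -- accept the unstable ("~x86")
# keyword for gtk+ only, leaving the rest of the system on stable
x11-libs/gtk+ ~x86
```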

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

  18. Re: Greetings / Project Goals

    On Aug 31, 5:39 pm, João Jerónimo
    wrote:
    > > What happens if a package is removed entirely?

    >
    > Old packages usually remain on the ftp servers for a long time, and for
    > very old packages the user had better update to a newer version anyway.


    Honestly, I don't like those answers. It might be fine for a hobby
    computer, but I wouldn't use something like that on a production server,
    for example. I appreciate robustness. This is one of the things that I
    find appealing about Minix 3. The Gentoo way doesn't strike me as
    robust.

    Nothing wrong with Gentoo of course. As you said, each distribution is
    designed with different goals. As I understand it, Gentoo is a
    bleeding-edge oriented distribution, so your answer is perfectly valid
    in that context. But for example you wouldn't see Debian making that
    kind of comment.

    > > I suspect that Gentoo uses
    > > ftp.mozilla.org because Gentoo users usually want to have the latest
    > > bleeding edge version of every software package.

    >
    > That's wrong. Portage only installs unstable and/or testing apps if the
    > user asks for them.


    When I say bleeding edge I don't mean "unstable". But as I understand
    it, Gentoo aims to always have available the latest release of every
    software package. I hear that Gentoo prides itself on often having an
    ebuild ready the day after a new release of an application.

    Daniel.

  19. Re: Greetings / Project Goals

    Daniel Carrera wrote:
    > But the way I read the documentation for source install, it essentially
    > says to download the tarball yourself and then run make && make


    Where does it say so? Please have a look at

    http://www.netbsd.org/docs/pkgsrc/us...ld-and-install

    where it says:

    For example, type
    % cd misc/figlet
    % make

    at the shell prompt to build the various components of the package.

    The next stage is to actually install the newly compiled program onto
    your system. Do this by entering:
    % make install

    while you are still in the directory for whatever package you are
    installing.

    --
    Regards,
    Angel

    O< ascii ribbon campaign - stop html mail and posts - www.asciiribbon.org

  20. Re: Greetings / Project Goals

    Daniel Carrera wrote:

    >>> What happens if a package is removed entirely?

    >> Old packages usually remain on the ftp servers for a long time, and
    >> for very old packages the user had better update to a newer version
    >> anyway.

    >
    > Honestly, I don't like those answers. It might be fine for a hobby
    > computer, but I wouldn't use something like that on a production server,
    > for example.


    I don't see any advantage in using five-year-old software when a couple
    of much better versions have come out since then. Also, as I mentioned,
    _old packages usually remain on the ftp servers for a very long time_.

    It's possible, however, that I only say this because I don't usually
    administer servers... Actually, I once ran a Gentoo-based server and had
    the update command on cron. One time, an innocent grub update rendered
    the system unbootable, and I lost an entire day trying to find where the
    problem was... It turned out that the configuration file's name had
    changed... :-)

    > I appreciate robustness. This is one of the things that I
    > find appealing about Minix 3. The Gentoo way doesn't strike me as
    > robust.


    Portage can be as useful for building a robust system as it is for
    building a bleeding-edge system (as you call it). I remembered just now
    that the masking system can be used to force the installation to stick
    with similar versions forever.

    The architecture-wise masking is made of several "keywords". Those
    keywords are "x86" or "amd64" (for example) for stable packages,
    and "~x86" or "~amd64" for unstable packages. When I want an unstable
    version of gtk+, I ask for gtk+ to be updated with the ~x86 keyword
    enabled. This way, Portage will see that a new version is available, and
    that that version is allowed according to the current keywords.

    I think that, instead, these keywords could be transformed into something
    like "x86_1.4.2" (a fictitious version number) for someone who wants to
    stick with packages originally intended for version 1.4.2 on x86 hosts.
    That way, security fixes could still be included as new, equally robust
    versions were released (but without jumping, e.g., from KDE 3 to KDE 4
    unless the user explicitly asks for it).

    > Nothing wrong with Gentoo of course. As you said, each distribution is
    > designed with different goals. As I understand it, Gentoo is a
    > bleeding-edge oriented distribution, so your answer is perfectly valid
    > in that context. But for example you wouldn't see Debian making that
    > kind of comment.


    I don't see it as bleeding edge, but rather as keeping updated.
    Normal distributions usually don't have any automated method for doing
    heavy updates. They only ship security fixes and so on...
    This is a limitation, because you can't keep updated unless you "break
    the rules". However, I understand that the strength of software often
    resides in its own limitations.

    > When I say bleeding edge I don't mean "unstable". But as I understand
    > it, Gentoo aims to always have available the latest release of every
    > software package. I hear that Gentoo prides itself in often having an
    > ebuild ready the day after a new release of an application.


    Those ebuilds are for those who really want them.

    --
    João Jerónimo

    "Computer are composed of software, hardware, and other stuff terminated
    in "ware", like firmware, tupperware, (...)" - by JJ.

