Apps For A Photographer


Chance1234

Hello,

I've stayed away from this forum for quite a while, a year and a half in fact. But from following the blogs, it's beginning to look like there's a real chance I'll receive my Pandora this year (probably in the 3500-4000 range), so I'm now trying to get myself excited (but not too excited) about it.

I was wondering, are there any applications I could use as a photographer, such as a simple photo retoucher or a depth-of-field calculator (there's a sketch of one just below)?
I ask because I'm looking into whether I'll be able to add this bad boy to my insurance.
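
Since a depth-of-field calculator is one of the asks, here is a minimal sketch of one using the standard hyperfocal-distance formulas. It is not an existing Pandora app; the function name, the 0.03 mm circle-of-confusion default and the example values are all illustrative.

```python
# Depth-of-field calculator using the standard thin-lens/hyperfocal approximations.
# All distances are in millimetres; the circle-of-confusion default (0.03 mm)
# is the value commonly quoted for full-frame 35 mm sensors.
def depth_of_field(focal_length_mm: float, f_number: float,
                   subject_distance_mm: float, coc_mm: float = 0.03):
    # Hyperfocal distance: focus here and everything from H/2 to infinity is acceptably sharp.
    h = focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm
    near = subject_distance_mm * (h - focal_length_mm) / (h + subject_distance_mm - 2 * focal_length_mm)
    if subject_distance_mm >= h:
        far = float("inf")
    else:
        far = subject_distance_mm * (h - focal_length_mm) / (h - subject_distance_mm)
    return near, far

# Example: a 50 mm lens at f/8 with the subject at 3 m.
near, far = depth_of_field(50, 8, 3000)
print(f"Near limit: {near / 1000:.2f} m, far limit: {far / 1000:.2f} m")
```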
 
I plan on using the Pandora as a databank to dump my full flash cards to, but a simple
photo-editing app would be awesome, as long as it could handle RAW.
 
GIMP is already out there. It currently can't export to any (non-GIMP) file formats and has no filters... but everything else works, even layers. A more complete port will arrive eventually anyway.
 
GIMP would need some kind of plugin if you're using RAW. UFRaw is pretty good. It's open source IIRC, but I don't know how simple it would be to port to the Pandora.
 
I think most GIMP plug-ins are written in Python IIRC, so UFRaw should "just work"™. The other good thing about the Pandora when used with a camera is that you can send your photos to a big PC back home via SSH or e-mail (sketched just after this post) and have them all post-processed by the time you get back. Plus you get a remote backup of your images, so if some scumbag steals your camera or bag, you at least still have the photos. Useful no end if you work professionally.

Edit: I didn't recall correctly; UFRaw is written in C, but it is open source, so recompiling it for the Pandora should not be an issue, unless the source uses asm at some point, which isn't likely.
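
Here is the kind of thing meant by sending the photos home: a minimal sketch that shells out to rsync over SSH, assuming rsync and an SSH key are already set up on both ends. The host name and paths are made-up examples.

```python
# Push the day's shots from a memory card to a machine back home over SSH.
# Assumes rsync is installed on both ends and key-based SSH login already works.
import subprocess
import sys

CARD_MOUNT = "/media/sdcard/DCIM"            # where the flash card is mounted (hypothetical)
REMOTE = "user@home-server:/backup/photos/"  # hypothetical destination

def push_photos() -> int:
    cmd = [
        "rsync",
        "-av",          # archive mode, verbose
        "--partial",    # keep partially transferred files so a dropped link can resume
        CARD_MOUNT + "/",   # trailing slash: copy the card's contents, not the folder itself
        REMOTE,
    ]
    return subprocess.call(cmd)

if __name__ == "__main__":
    sys.exit(push_photos())
```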
 
hobbyman II said:
I think most GIMP plug-ins are written in Python IIRC, so UFRaw should "just work"™. The other good thing about the Pandora when used with a camera is that you can send your photos to a big PC back home via SSH or e-mail (sketched just after this post) and have them all post-processed by the time you get back. Plus you get a remote backup of your images, so if some scumbag steals your camera or bag, you at least still have the photos. Useful no end if you work professionally.

Edit: I didn't recall correctly; UFRaw is written in C, but it is open source, so recompiling it for the Pandora should not be an issue, unless the source uses asm at some point, which isn't likely.

Thanks

That's good to know, as I use RAW quite a lot. I'm hoping to use the high-res screen to review my photos; the display on the camera isn't clear enough.
 
Has anyone tried opening very high def pictures on their Pandora? Even my laptop struggles with them.
 
I just opened an 8000 x 8000, 100 MB-plus image on my laptop and it took about 1.5 seconds to open (including loading the viewer application). Maybe you need a new laptop? Or maybe that 500-terapixel camera you bought is a bit OTT ;)

(That's a 64-megapixel image; most cameras don't go that high, and 15 megapixels is considered really good unless you are into very high-tech photography.)

Edit: I was just reading up on RAW. GIMP supports many RAW formats by default in the latest version; try downloading it and see if yours is recognised. Success could be just a recompile away, assuming the Pandora's GIMP isn't the latest version already.
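
For the "databank" use case, another option is to batch-convert RAW files to TIFF with a command-line decoder so they can be reviewed in any viewer. This is a minimal sketch assuming dcraw (or a compatible decoder) is installed; the extensions, flags and card path are illustrative, so check them against your local man page and camera.

```python
# Batch-convert RAW files to TIFF for quick review, shelling out to dcraw.
# Assumes dcraw is installed and on the PATH.
import subprocess
from pathlib import Path

RAW_EXTENSIONS = {".cr2", ".nef", ".dng", ".raf"}  # extend for your camera

def convert_raw(folder: str) -> None:
    for raw in Path(folder).iterdir():
        if raw.suffix.lower() not in RAW_EXTENSIONS:
            continue
        # -T: write a TIFF instead of PPM; -w: use the camera's white balance
        result = subprocess.run(["dcraw", "-T", "-w", str(raw)], capture_output=True)
        if result.returncode != 0:
            print(f"Failed to convert {raw.name}: {result.stderr.decode().strip()}")
        else:
            print(f"Converted {raw.name} -> {raw.with_suffix('.tiff').name}")

if __name__ == "__main__":
    convert_raw("/media/sdcard/DCIM")  # hypothetical card mount point
```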
 
My laptop is an older ThinkPad from... 2003. It's a snappy 1.6 GHz and runs all my applications well, but it isn't cutting edge, true. :lol:
 
SomeGuy99 said:
Has anyone tried opening very high def pictures on their Pandora? Even my laptop struggles with them.
That's generally an issue of insufficient RAM more than anything else. I remember some years back, one of my roommates was flipping 15 MB TIFFs back and forth a bit like a flip book. That was on one of those New World Macs, the ones that were in use around the time OS X was released. But it had some obscene amount of RAM and no visible slowdown.
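
To put a rough number on that: the 8000 x 8000 example from earlier decodes to far more than its 100 MB file size once it sits uncompressed in memory. A back-of-the-envelope sketch; the helper function is illustrative and ignores undo buffers, layers and caches:

```python
# Rough estimate of the RAM needed to hold a decoded image in memory.
def uncompressed_size_mib(width: int, height: int,
                          channels: int = 3, bytes_per_channel: int = 1) -> float:
    return width * height * channels * bytes_per_channel / (1024 ** 2)

# The 8000 x 8000 example: about 183 MiB as plain 8-bit RGB,
# and roughly double that at 16 bits per channel, which is typical for RAW workflows.
print(f"{uncompressed_size_mib(8000, 8000):.0f} MiB at 8-bit RGB")
print(f"{uncompressed_size_mib(8000, 8000, bytes_per_channel=2):.0f} MiB at 16-bit RGB")
```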
 
hedwards said:
SomeGuy99 said:
Has anyone tried opening very high def pictures on their Pandora? Even my laptop struggles with them.
That's generally an issue of insufficient RAM more than anything else. I remember some years back, one of my roommates was flipping 15 MB TIFFs back and forth a bit like a flip book. That was on one of those New World Macs, the ones that were in use around the time OS X was released. But it had some obscene amount of RAM and no visible slowdown.

Is 1GB not enough?!
 
hedwards said:
That's generally an issue of insufficient RAM more than anything else. I remember some years back, one of my roommates was flipping 15 MB TIFFs back and forth a bit like a flip book. That was on one of those New World Macs, the ones that were in use around the time OS X was released. But it had some obscene amount of RAM and no visible slowdown.

It's been a long time since I heard anyone make a distinction between Macs using the New World ROM and the Old World ROM. :p
 
Is 1GB not enough?!

That depends. If you are running a cut-down Linux like Puppy or DSL, then it's enough; if you are running XP, Vista or Win 7 (aka Vista Reloaded), then it's not going to be enough. There's a lot of overhead, memory-wise and CPU-wise, with Windows. I mean, think about it: minimum system requirements for the OS? That's crazy. The OS should take up as little in the way of resources as possible. I remember getting a full GUI OS (DOS with a Midnight Commander-like shell on top) onto just one 760k floppy with room for data and apps left over. Most eight-bit computers had their OS/core language in as little as 4 to 16 KB; only machines like the C64 and Dragon used large amounts of system RAM (~32 KB), and IMO that's down to them being written by MS more than anything.

Amiga, Atari and Amstrad all had systems in ROM or on a single floppy that were recognisable "point and click" environments, recognised external devices, connected to networks and so on. So why modern OSes need gigabytes of HD space and high-end CPUs/GFX to run escapes me totally. I could understand it if my system was a full-blown AI, but not for DOS with pictures; that's just dumb and wasteful.

IMO modern coding practices are to blame. OOP and the library mindset lead to bloated code, code that expands well beyond what you would expect for its functionality. You only have to look at "dependency hell" on any Linux platform: install application A and you need libs/applications C, D, E and F; lib/app C needs codec G and libs H, I, J and K; lib/app D needs Python and Python libs L, M and N, plus codecs O, P, Q, R, S and T, apps U, V and W, and libs X, Y and Z.

Then, when you have THAT sorted, you can resolve the dependencies for C, D, E, F, G, H, I, J, K, Python, Ruby, Java, SQL, Perl, etc., etc. It's a total mess. When I look at a Linux install, it's with a combination of wonder and horror: wonder that it actually works, and horror at the thought that someone decided it was a good idea to build a system that depends on innumerable languages (Python, Perl, Ruby, Mono, bash, HTML and other scripts), libraries (just how many sound drivers/mixers/controllers DO you really need?), managers, daemons, cron tasks, etc., etc. URGH!

Yes, it's all very "open", but it's also an anarchistic MESS. IMO the OOP paradigm is to blame: with all that abstraction, people are forgetting what is going on at machine level. Probably the only people who write with the hardware in mind are driver coders; everyone else just keeps slapping another layer of abstraction on top of the whole heap. Even with optimising compilers, most of the code is so far removed from the hardware it's running on that most of the OS spends a lot of time emulating virtual machines rather than actually DOING something.

It's no wonder my dual-core 2.8 GHz, 3 GB RAM, 1 TB HD desktop seems sluggish at times compared to my old 14 MHz Amiga. The Amiga had the immeasurable advantage of having the core written for the hardware all the way through. The load on the system for an Amiga is so low that you can emulate the original machine at crazy speeds on modern hardware DESPITE BEING EMULATED ON AN INEFFICIENT SYSTEM; that has to tell you something.

Try using an emulator running Amiga Workbench for a while on a modern machine and it becomes apparent that something is badly wrong, when emulating a whole computer AND its OS on a modern machine gives you a faster OS than the one on the machine you are running it on <_< . OK, it might not be so pretty, but I don't consider ANY modern desktop so pretty it deserves to own the whole machine just to add drop shadows to the mouse pointer :p

Oooh, I do go on... time to shut up. <pet peeve off>
 
hobbyman II said:
Is 1GB not enough?!

That depends. If you are running a cut-down Linux like Puppy or DSL, then it's enough; if you are running XP, Vista or Win 7, then it's not going to be enough. [snip]

You get taught that OOP and libraries are there to make life easier and, IIRC, because 'gosub' and 'goto' were often misused, leading to hard-to-maintain code.

As you say, it may have solved one problem while causing another. I don't understand why modern PCs don't boot from ROM. If they did, they'd have the potential to boot faster (like my Commodore 16), and the fixed amount of space makes you really think about what the core system REALLY needs. User wants could be stored elsewhere (HD, SD, etc.). It's part of what attracted me to the Pandora.
 
hobbyman II said:
Is 1GB not enough?!

That depends. If you are running a cut-down Linux like Puppy or DSL, then it's enough; if you are running XP, Vista or Win 7, then it's not going to be enough. [snip]

+1

You might be interested to know that XP is actually faster than Linux, even Puppy. With some optimisation, I can run XP on a Pentium II 200 with 128 MB of RAM (that's pretty much the lowest spec XP will accept)... and it's still faster than Linux.

The problem is hardware choice. You can choose what your PC contains... that in itself requires an 'operating system'. Workbench was partially held on ROM (not just floppy), which I wasn't aware of for many years. And as we expect a PC to cope with more and more things, new technologies and such, the bloat just continues to grow. But poor software design and optimisation is still largely to blame.

Microsoft Office still does essentially the same stuff as the Windows 3.1 versions. The same goes for Photoshop, and many other longstanding packages. Why do the system specs increase so dramatically every year? That is ridiculous.

Every development team should be forced to use five- or ten-year-old hardware, but instead they have the best machines money can buy. The stuff they write appears to work just fine... for them.

OS X is the worst offender. Everything there is built to target the latest Macs. If I run a functionally basic program on my original iMac, it slows to a crawl; trying to type in MS Office is like living in slow motion. Why?! I can fire up a much older version which does exactly the same thing, and it's fine.

Death to bloat!
 
SomeGuy99 said:
The problem is hardware choice. You can choose what your PC contains... that in itself requires an 'operating system'. Workbench was partially held on ROM (not just floppy), which I wasn't aware of for many years.

Hmm, no. Kickstart was contained on ROM, not Workbench. Kickstart contained the bootstrap, the basic windowing system (no text-only mode in AmigaOS), and the basic AmigaDOS command set.

Workbench had to be loaded from floppy.

As a side note, on the original Amiga 1000, Kickstart also needed to be loaded from a floppy.
 
Pleng said:
SomeGuy99 said:
The problem is hardware choice. You can choose what your PC contains... that in itself requires an 'operating system'. Workbench was partially held on ROM (not just floppy), which I wasn't aware of for many years.

Hmm, no. Kickstart was contained on ROM, not Workbench. Kickstart contained the bootstrap, the basic windowing system (no text-only mode in AmigaOS), and the basic AmigaDOS command set.

Workbench had to be loaded from floppy.

As a side note, on the original Amiga 1000, Kickstart also needed to be loaded from a floppy.

So it was partially held on ROM. The fact that AmigaDOS, the windowing system and the basic machine functionality were half loaded/run from there only supports this, right?

I have an Amiga 600 btw.

Wikipedia:

Workbench.library in its first versions even occupied no space on system floppy discs, because it was part of the system ROM. Starting from 2.0 it became a shared library in Libs: and could be replaced by third-party GUIs.

The AmigaOS library APIs required by Workbench were stored in ROM, or (on the earliest Amigas) loaded into WCS/WOM (Lockable/Write Once Memory) by the Kickstart system. Applications launched from either the CLI or Workbench executed equivalently, with both having full GUI functionality. Workbench-launched applications were meant to report their successful launch back to the Workbench, but this was not a requirement and few actually did. The CLI was entirely graphically based; the Amiga did not support character-mapped displays.
 
OK, some of this stuff I can agree with, but some of it is absolutely outrageous.

For one thing, Windows still has dependency hell. The only reason it doesn't appear to is that every program has everything either linked in or bundled in its program directory. Linux handles this by having you install only one copy of each library, so the process is much more visible.
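
To make that visibility concrete, you can list exactly which shared libraries a given program pulls in. A minimal sketch assuming a Linux system with ldd on the PATH; the GIMP path at the end is just an example binary.

```python
# List the shared libraries an executable pulls in, making the
# "one copy of each library" point visible. Shells out to ldd.
import subprocess

def shared_libs(binary: str) -> list[str]:
    """Return the names of shared libraries reported by ldd for one executable."""
    out = subprocess.run(["ldd", binary], capture_output=True, text=True)
    libs = []
    for line in out.stdout.splitlines():
        # typical ldd line: "libjpeg.so.8 => /usr/lib/libjpeg.so.8 (0x00007f...)"
        if "=>" in line:
            libs.append(line.strip().split()[0])
    return libs

if __name__ == "__main__":
    for lib in shared_libs("/usr/bin/gimp"):  # example binary; use any installed program
        print(lib)
```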

I can also agree that the several levels of abstraction are bad, but as far as modern OSes and modern hardware go, you pretty much need at least one common interface, so your application can work on multiple platforms and so programs can work together in a stable and secure way; if one program decides to crap itself, it won't take everything else down with it (see: early Windows). And direct hardware access from an application will pretty much limit you to being able to use that one application alone. Another thing to note is that modern hardware is generally many, many times more complex than the hardware that makes up an Amiga or Commodore 64, and it's also poorly documented, not documented at all, or even illegal to try to reverse engineer and use directly.

I think a happy medium does need to be struck between ease of programming, stability, portability and speed, so expecting everyone to write everything in pure ASM and work directly with the hardware is just nonsense and completely ridiculous. Some things need speed, and they use assembly optimizations; some things need to be a bit more robust or portable, and they use stuff like Python; and some things are just fine in the middle, so they use stuff like C and C++.

Also, the use of "libraries" is not the reason for bloat and slowness as much as the use of all kinds of "abstraction libraries". Something like SDL is a good thing in that it can work with whatever OS you feel like compiling it on with very little rewriting, and if used right it translates pretty closely to what you'd write if you used X11 or DirectX directly. But stuff like PulseAudio is a bit ridiculous, because OSS and ALSA pretty much already provide most of the same services, and PulseAudio only really adds a few functions that aren't worth the overhead, especially on something like the Pandora or some other low-power device. As for libraries themselves, why reinvent the wheel when many people have been working on the same problem and likely have a faster solution? The only overhead there is maybe a handful of instructions to perform the function call, and anything that needed the speed that badly would likely be inlined anyway. I can't really comment on OOP since I haven't used it too much.

EDIT: Sorry if this is unclear or anything, I have trouble keeping things straight in my head sometimes. Never was a particularly good writer. :p
 