Luminar 4 doesn't use the GPU

Answered

33 comments

  • Colin Grant

    Hi Steffen,
    To be honest, Luminar was never going to be an Affinity; it was supposed to be a Lr, but it never made that either. I have had Affinity since the start. It is good, but it is just a Ps clone (a very good one at an excellent price), and I much prefer the Adobe stuff. Raw editing in Affinity is not a pleasing experience for me. All that said, Affinity is a far more professional and robust piece of software, with excellent support.

    In my experience, most software houses announce and deliver most of the time. There are occasions when they get it wrong, of course, but most of the time they get it right, and when they do get it wrong they acknowledge it by putting it right quickly. Skylum seem to adopt the complete opposite practice! I will never buy the software again, no matter how it develops.

     

     

  • Daniel

    I just purchased Luminar 4. Before buying this software I used Lightroom, and everything worked so smoothly there. You really have to focus on improving the performance of this software. The interface is nice and not cluttered with unnecessary stuff; I don't think it needs a rework until the performance issue is resolved, because right now the app is pretty much unusable and the experience is awful.

    I really didn't know anything about this app, but I saw some tutorials on YouTube and it seemed like exactly the app I needed for photo editing. The thing that bothers me most is that this is version 4, and as far as I've read the performance issue has been well known since the beginning, yet nothing has been done about it. No new version should be released with such a major problem. The UI is unimportant compared to that.

    I really don't think the problem is that it doesn't use the GPU, but rather the algorithms used, especially the AI part, which is a nice feature but makes the app unusable. Please improve the algorithms; this is not a small issue, it makes all the difference. As I said, the interface is good and user friendly, but it's no use having it if you have to wait so long after every change you make to a slider. It's like having a good-looking website where navigating from one page to another takes 5 or 6 seconds. That's not normal, especially for an app that has reached version 4.

    Please focus on improving the algorithms as soon as possible, and maybe give up on the ones that simply cannot be improved; I assume those would mostly be the AI ones.

  • Daniel

    If you think the GPU is that important, I have this for you:

    "Starting from August 2019 (version 8.4) release of Lightroom Classic, you can enable GPU for image processing on supported computers to accelerate image editing", taken from https://helpx.adobe.com/ro/lightroom-classic/kb/lightroom-gpu-faq.html

    So if Lightroom could be fast without using the GPU, so can this software. I agree that it would be better to use the GPU, but I don't think that's the main problem. I am pretty sure a complex algorithm can really slow down editing, especially when dealing with RAW files.

    Anyway, that is beyond the scope of this thread; the main point is that the software needs a lot of improvement on the back end.

     

  • Alan Cockburn

    Interesting stuff, and it's great to know I'm not alone with the speed issue; initially I thought it was me, as I'm only setting off into photography and am a little green/unsure about these issues. I too am considering purchasing a new all-singing, all-dancing MacBook Pro, so it's great to read the info from you all.

    Interestingly, I've just downloaded Adobe Lightroom/PS to give it a whirl on my old 2012 MacBook Pro (upgraded to 16 GB RAM and an SSD), and it works fine. I'm no professional in these matters, but at the end of the day I'm only looking for functionality that delivers an output. This isn't happening, Skylum, and you need to realise people have paid good money for a product they expect to have been designed, developed and rigorously tested. It looks like that hasn't happened, so I for one will be voting with my feet and moving to Lightroom/PS.

  • Steffen Klaer

    Dear Kate,

    I appreciate your quick reply, although I would have preferred a different answer. Please don't get me wrong: I am taking the time to help you improve Luminar, especially from a strategic point of view.

    Criticism is not my goal; what I want is a long-term relationship with a good balance of functionality, stability, speed and cost. If any of these items drops out of balance in your product, it will fail one way or another.

    1. Performance. Your note on rendering response times is understood, and it helpfully shows me that there is most likely no mistake on my side, but it should certainly be a trigger for you to look into your competitors' performance. Please make sure you do this with emphasis within your organisation. Don't take it lightly.

    2. GPU timeline. What you are writing contradicts previous entries from Helga Egilsdottir, Victoria Grace and other community posts. Is there a roadmap or not?

    If the GPU ETA is still not fixed, or even in process, please again exert sufficient pressure to get a reliable and honest response for this community. I appreciate your previous response within hours on a Saturday, but I believe we would all be happier if you took the time to investigate internally and then post a response we can build on.

    I feel a similar situation has occurred repeatedly with Luminar's release quality: software could not be downloaded, would not start, froze, and more. Quality is a big issue these days and should take priority over an over-early release. Trust in a product is easier to destroy than to rebuild. This is true for any product, by the way. Please consider this in your next team meetings. And please also consider this: if you commit to a fixed date now, say Q1/2021, you set yourself a target and stop wavering. Targets are a good thing for achieving results, and they serve as an enterprise guideline. I count on you.

    Thank you & best regards,
    Steffen

    PS. My hardware is Windows-based, so I might have accidentally placed my comments in the wrong place. That does not make the statements above any less valid, though. Please feel free to move this to a different thread, but send me a link to the new location.

  • Colin Grant

    Steffen, we are more likely to witness a squadron of pigs flying by than to see Skylum take your suggestions/demands seriously. Issues have been there since inception, and there has never been a reliable timeline (indeed, any timeline) for putting them right. We have also had roadmaps that proved temporary, in that they were subject to change or complete abandonment. Skylum are what they are. Their business model and customer service do not appeal to me, but it is what it is. Have you asked yourself why there is a switch in preferences to turn the graphics processor on and off?

    All I will say is that 4.3 is running better here than the previous releases. It is far from perfect in many ways, but it is better. It works well enough as a Lr plugin, and that is the only way I will use the software, given the state of that so-called DAM. Personally, I do not see Luminar as a future replacement for the likes of Adobe or C1, but I can see how it suits some users.

  • Colin Grant

    The GPU toggle does absolutely nothing so far as I can see. Others have reported the same. The question, therefore, is why it is there! Skylum are saying that implementation remains on the list, but for who knows when. Smoke and mirrors have ruled this development since inception. It will not change anytime soon. My view is it will never change; not even the support staff are consistent in what they say. Still, it is a fair enough plugin, as some of the filters are useful. Not sky replacement, though: what is the point if there is no manual workaround should the AI not pick up the sky, as happens when the sky is less than 5% (I think) of the image.

  • Colin Grant

    So that suggests that L4 will never get it and that Luminar AI won't have it at launch. It all sounds a bit like the promise to add EXIF support to L3 and L4.

  • Colin Grant

    Helga, that is all well and good, but would it not be better to optimise the software before developing more state-of-the-art tools? In any event, why aren't the new tools using the GPU more anyway? And when exactly do you intend to "get there"? I do not like the sound of "long-term tasks" when we are talking about a version 4 piece of software. I suspect others feel likewise!

  • Colin Grant

    Skylum have been saying that for four versions. The promise is beginning to wear a bit thin, Victoria Grace.

  • David Kelly

    It's insane that Luminar doesn't use the GPU. A photo app that doesn't use the fast graphics processor shows poor design. I would call it a fundamental flaw.

    Offloading processing functions to the GPU would make a _massive_ difference to performance. Skylum, you guys should prioritise this, if for no other reason than to be free of all the complaints you get from annoyed customers about how slowly your app runs.

  • Colin Grant

    You are wrong: the GPU is a significant part of the problem. Every other serious editing package makes heavy use of the GPU; Lr, ON1, Capture One and Topaz for a start. So please do a little research before grasping at straws and blaming the AI algorithms. It is not just the AI stuff that is slow: working with several layers can be slow, sliders can become laggy or frozen, image refresh can be slow, and exporting is a joke. That is not to mention the library and the corruption of the database. Maybe the algorithms could be better in places, maybe not. I'll settle for leaving the algorithms alone and getting back to the basic performance issues, where the GPU plays a significant part.

  • Steffen Klaer

    Dear Skylum Team,

    It is not often that I actively post, but here I feel urged to do so because I like your approach, ideas and AI features. I agree with the members above requesting a performance boost through GPU usage. I understand that this cannot be done overnight. However, please take the following notes into account:

    1. My configuration:

    - State-of-the-art board with an i7-9700, 3.0 GHz, 8 cores
    - 32 GB DDR4 RAM
    - Dedicated internal SSD for documents and pictures, with no software installed on it
    - NVIDIA GT 1030 with 2 GB GDDR5 (OK, the latter is not high-end, but the other components are quite strong)
    - Luminar 4.3.0 (build 6325)

    2. Target group. Those who buy and use your software most likely rank above the average user, who is probably not aware of such a thing as the RAW format or how to use it (no offense!). For simply collecting, storing and reviewing .jpg pictures, your software is too expensive, and there are several alternatives on the shareware and freeware market. To simplify, let me call your target group "semi-professional up to professional". For this target group, time is more costly and valuable than for the average shooter, and their expectations are higher. On my system, the time Luminar needs to make a picture ready on screen for processing or AI ranges between 0 and 5 seconds. That is not acceptable compared to, e.g., ACDSee Pro 2020; there, we are talking at most 1/10 to 1/5 of a second, so I barely notice. Which brings me to the second thought:

    3. Alternatives and competition. Feature-wise, your app is currently fine and a strong alternative to Lightroom and others. Speed-wise, it is not up to expectations. Both ACDSee Pro and Affinity Photo are there instantly on the above system. Both use a dedicated GPU, with UI options to switch if needed. We are writing in anno domini 2020, with strong GPUs on the market for more than 20 years! Please use them from the very start of development.

    4. Support. In case you have any suggestions for the community and me, aside from the above, to increase speed, I will be eager to learn and revise this comment.

    5. Conclusion. I understand that features are a sales argument and a booster on websites, in reviews and in advertising. But please also take into account the frustration during the trial period when users encounter laggy performance. I simply did not expect that, as speed is no longer an obstacle with today's hardware. Your features are great, your performance is poor, and in total that is no more than average. Please fix this quickly by using a dedicated GPU, and show it on the settings page. I am sure you will make more than one user happy and a long-term customer.

    And please give us a timeline for implementing GPU support or other speed increases. Any educated estimate is better than nothing!

    Looking forward to your reply,
    Steffen

  • Florian Schnarr

    Hi Colin,

    At least, that is the way I understand Helga. I had very high hopes that GPU usage would be added to L3, and later to L4. Given these circumstances, I doubt it will be in Luminar AI at launch, or anytime beyond that.

    It's really sad, since I like the software overall, but I will turn back to C1 for good.

  • Steffen Klaer

    Dear Colin,

    You are probably right. I am somewhat frustrated that Skylum apparently reads these requests and appeases us with words and comments. When it boils down to it, there is no commitment or ETA.

    And to hear that there were announcements even for L3 makes it even worse.

    For my part, I will keep an eye on this software, but I will not recommend or buy it unless there is a guarantee that GPU support is available. And I mean implemented, not only announced.

    Best regards to all,
    Steffen

    PS. To give you an idea that all this can be done, have a look at Affinity. They announce and deliver!

  • Michael Marriott

    I agree with Steffen and others. It is puzzling that they continue to over-promise and under-deliver so consistently with respect to key infrastructure issues. I really like and am impressed by their AI tools and their user interface, but given the poor underlying infrastructure, it is a bit like lipstick on a pig. I have never experienced this lack of integrity and responsiveness from any other vendor.

    I have promised myself not to buy anything more from Skylum until they provide GPU support and show progress on an enhanced DAM.

    Cheers!

    Michael

  • Helga Egilsdottir

    Hello David,

    At this time, Luminar relies more on the CPU and RAM than on the GPU, which is why you did not see any activity when checking the GPU history.
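
    (If you want to see this for yourself on an NVIDIA card, the minimal sketch below polls the standard nvidia-smi tool that ships with NVIDIA's drivers; it is an illustration only, not something Skylum provides. Run it, then drag a few sliders in Luminar and watch the reported load.)

    # Illustration only: log GPU utilization once per second for 30 seconds.
    # Assumes nvidia-smi (installed with the NVIDIA drivers) is on the PATH.
    import subprocess
    import time

    def gpu_utilization_percent():
        # nvidia-smi's query mode prints one utilization figure per GPU, one per line.
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.strip().splitlines()[0])  # first GPU only

    if __name__ == "__main__":
        for _ in range(30):
            print("GPU load: {}%".format(gpu_utilization_percent()))
            time.sleep(1)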

    Making Luminar rely more on the GPU is one of the major tasks on our docket; we're constantly working on optimizing the editing algorithms and the way the app uses resources. However, I do feel the pain caused by the current performance of the software, and I share it too. Our devs are working day and night to make it go away, but it's taking time, as everyone is doing their best to combine the development of state-of-the-art editing tools with optimization of the software.

    We have several large-scale tasks which, in the long term, will give a boost both to the way the app performs and to the development process itself.

    Thus, we're getting there. Stay tuned!

    For now, the best way to speed up the app is to try dividing your Library into several Catalogs (i.e., loading some of the photos into one catalog, then creating a new one via File > Catalog > New and loading another set of images there). You'll be able to switch between the catalogs via File > Catalog > Open.

    Another good idea is to keep both the catalog and the photos on the internal drive.

  • Victoria Grace

    Hey Colin,

    We'll do our best to improve this in future updates. Thank you very much for taking the time to share your feedback. We do appreciate it and take every suggestion we receive into account.

  • Colin Grant

    Lr has been around a long time, and it was getting slower and slower as the functionality increased. Adding the GPU helped big time. Luminar, on the other hand, has been a performance dog since version 1, and AI was not included then.

  • David Kelly

    Adobe was slow to add GPU support to Lightroom, and there were a lot of complaints about its speed. Adobe has the excuse of a large legacy codebase that has been in development since 2007. Luminar is relatively new. Competitors like Pixelmator were making use of the GPU way back in 2014, so there's no excuse for Skylum not to add GPU support. In fact, they should have done it from the beginning.

    I agree that Luminar should be a lot faster even without GPU support. Adding a layer with nothing more than an exposure adjustment slows down the editing of a single image quite a bit, which is ridiculous. Moving functions to the GPU, which is highly optimized for image processing, is the easiest and quickest way to fix these performance problems without having to rewrite large sections of code.
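
    To give a feel for the kind of offload I mean, here is a minimal toy sketch; it is my own illustration, not Luminar's code, and it assumes NumPy plus the third-party CuPy library as a stand-in GPU framework.

    # Toy example: a per-pixel exposure adjustment on the CPU vs. the GPU.
    # NOT Luminar's code -- just an illustration of moving the same math to the GPU.
    import numpy as np

    def exposure_cpu(img, stops):
        # CPU path: multiply every pixel by 2^stops, then clamp to [0, 1].
        return np.clip(img * (2.0 ** stops), 0.0, 1.0)

    def exposure_gpu(img, stops):
        # GPU path: the same multiply and clamp, executed on the GPU via CuPy.
        import cupy as cp
        g = cp.asarray(img)                        # copy host -> device
        g = cp.clip(g * (2.0 ** stops), 0.0, 1.0)  # computed on the GPU
        return cp.asnumpy(g)                       # copy device -> host

    if __name__ == "__main__":
        img = np.random.rand(2000, 3000, 3).astype(np.float32)  # ~6 MP float image
        out = exposure_cpu(img, 0.5)
        # out = exposure_gpu(img, 0.5)  # same result, computed on the GPU

    The win comes from the GPU running that per-pixel math across thousands of cores at once; the cost is the host-to-device copy, which is why real editors keep the image on the GPU between slider moves.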

    Please Skylum, take note and work on a quick win for a performance boost!

    (I'm a developer, just FYI.)

  • Kate Williams

    Hi Steffen,

    Thanks for taking the time to give such thoughtful feedback. Much appreciated! 

    At this very moment, up to 10 seconds for RAW file rendering is considered average speed in Luminar, so 0-5 seconds for a RAW file to get rendered is actually a pretty decent result. Not much can be done on your end right now, except for the suggestions outlined above.

    Right now we have no set timeframe for GPU acceleration, but we do plan to include more performance improvements in upcoming releases. We'll share a more precise ETA as soon as one is available.

  • Elena Blum

    Steffen Klaer,

    We appreciate your suggestions. The experiences and contributions of our users are very valuable to us.

    I can definitely see why you would like to have features and improvements like that.

    At the moment, I have no ETA for the update that would address this; however, I'll make sure to pass your comment on to the team in charge for their consideration.

  • Steffen Klaer

    Dear Elena,

    Thanks for your response. I kindly ask you to provide a binding timeline, no later than Sep 15, 2020, on what is planned to speed up Luminar. Or honestly state that this is not a priority for the next few years! As said before, the lack of commitment is frustrating for customers, and for you as well, because you get these kinds of questions and complaints from all sides. A firm commitment is good for you too. In not using the GPU, Luminar seems to be the only package on the market going that way.

    Final remark: in my team meetings I use the "empty-chair method" by default: in each meeting, there is an empty chair representing the customer. After the meeting, the team and I answer the question: "What would the customer say about this meeting? How would he or she rate it with respect to satisfaction?" Give it a try. It is a great approach.

    Best regards,
    Steffen

     

    PS. Below are a few more links I found on this topic within 5 minutes. It is a huge issue!

    https://community.skylum.com/hc/en-us/community/posts/360007004640-Luminar-4-GPU-Acceleration-for-PC-

    https://community.skylum.com/hc/en-us/community/posts/360006599380-Luminar-4-and-GPU-usage

    https://community.skylum.com/hc/en-us/community/posts/360043584892-GPU-usage-question-

    https://community.skylum.com/hc/en-us/community/posts/360008048579-Luminar-4-doesn-t-use-the-GPU

    https://community.skylum.com/hc/en-us/community/posts/360008267120-Luminar-4-and-nvidia-gpu

  • Steffen Klaer

    Dear Colin,

    Thanks. I am definitely lacking the history here, but I also like to give credit and express my hopes and expectations before giving up and asking for a refund. I do hope Skylum will prove both of us wrong.

    Sometimes the time is right and one drop is enough to make the barrel overflow (a typical German saying). As said, I like the features and the AI approach, and I hope for a long relationship.

    Unfortunately, I do not see the GPU toggle in my UI (Windows version). What does it do?

    Best regards,
    Steffen

  • Florian Schnarr

    For all who follow this: it's somewhat of a shame that this post (like many similar ones) is marked "Answered". I would consider a post answered when the OP confirms the issue was solved, not when Skylum support has simply given a reply.

    Given these circumstances, I can only advise you NOT to purchase their upcoming Luminar AI product, as they have not even been able to fix V3 and V4 (although they have been promising to for ages now).

    Anyone want to bet on whether the new successor product, Luminar AI, will use the GPU for rendering?

  • Colin Grant

    I guess this thread is closed, as L4 will now never get GPU support. As to the future, I am not guessing, but then I can see no reason why I would ever trust Skylum again or buy their offerings.

  • Helga Egilsdottir

    Dear Florian Schnarr,

    I'm terribly sorry to hear you have the impression that we decided to leave this problem without attention. Let me kindly assure you that even after announcing Luminar AI, we keep supporting and improving Luminar 4. The Skylum Community forum was created to discuss and track all the suggestions that our users share with us, and we do monitor every post and comment. Our team has not decided this issue is resolved, so it is currently in process. We would love to implement all the suggestions and feedback that you kindly provide; however, some of them take more time to integrate than we expected. Please bear with us and stay tuned. We do hear all the suggestions and take them into consideration. We appreciate all your kind input!

  • Colin Grant

    So, Florian Schnarr, you can categorically confirm that GPU support will be added to L4 shortly, and that Luminar AI will have it at release?

     

  • Tyler Jensen

    The failure to take advantage of the GPU is a serious problem. I'll be moving my workflow back to Lightroom despite some of the unique features offered in Luminar.

  • Ary Djajadi

    I bought Luminar 4, and I hardly use it because it is unusable. I bought it without trying it, because I loved the features. I have been using Luminar since Luminar 2018, and it just gets slower with every release. Do you guys use your own software? It is not usable without utilizing the GPU, c'mon man.

