Welcome to Our Alpha Shooters Community Forum


Hello from new (old) guy in sunny Stoke-on-Trent

smilo (Steve Miles) · Newcomer · Joined Jun 18, 2025 · Posts: 3
Thought I would join up and see what I can learn from you guys. I doubt there is much I can contribute, but you never know! Most of my earlier photos are from previous travels, taken with an assortment of compact cameras. I then bought a Zenith with a telephoto lens for motorcycle shots over 14 visits to the Isle of Man TT races, then a few of the TZ-series Panasonics before I started to get serious and graduated to a Sony NEX, bought in Australia (which eventually died with a very unhelpful 'software error').

In Australia I had been exposed to bird and wildlife photography, and once home (I almost jumped ship at that point but decided to stick with Sony) I bought an A6000 with a PZ and 55-210 lenses. A few years on and I bought a Sony 100-400GM lens for greater zoom and better optics, then after a few years graduated to my present A7CR. I loved the A6000 (even with the big lens) but I'm not yet getting the same feeling for the A7CR. It's excellent with a small lens for landscapes and portraits but for wildlife, I have a few issues. Hence joining up here to see what I can learn!
I take photos mainly for me and print/frame the best. I love photos and I have scanned thousands from before digital, as a family or travel record but obviously I have become more interested in the actual photo quality since the change to digital. My family and friends sometimes get to see the best ones with birthday cards, photobooks, prints or calendars! I don't do any social media!
 
Welcome. What made you go with the A7CR over the A6700? You outgrew the APS-C lens offerings? Or just wanted more megapixels to work with?
 
I bought an A6000 with a PZ and 55-210 lenses. A few years on and I bought a Sony 100-400GM lens for greater zoom and better optics, then after a few years graduated to my present A7CR. I loved the A6000 (even with the big lens) but I'm not yet getting the same feeling for the A7CR. It's excellent with a small lens for landscapes and portraits but for wildlife, I have a few issues.
Though a few of my favorite photos are from my A6000 + 55-210 days I can say that I don't miss it. It was a great place to start.

So the A7Rs are not really designed for wildlife, especially with silent shooting, as the sensor readout is not very fast, but that doesn't mean it can't be done. First, I would suggest using the mechanical shutter for all wildlife. Second, for fast-action stuff (think birds in flight), use an AF area that is either centre, or use APS-C mode with wide. Either of these reduces the work the AF has to do and will improve the tracking. The 100-400GM will have no issues keeping up.
 
Welcome to the forum @smilo. Hopefully some of us can be of help to you.
 
Hi guys,

I know the A7Rs are not primarily designed for wildlife, but I assumed that for my all-round use it would still be a real improvement over the A6000, and mostly it is. The move to the A7CR was for more megapixels and the fact that the A6000 was quite a few years old and the latest cameras offer so much more technology.

This brings me to my first real question: Sony (and other manufacturers) use phrases like 'deep learning' for their latest AI camera software. I interpret this as meaning the software recognises your photos and improves the camera's response as you take more and more photos. For example, assuming you have bird detection on and you take 2000 photos of birds, the camera will learn from this and will recognise birds more easily in future.

So if, after taking the 2000 photos, your latest kingfisher is sitting in the centre of the frame (in good light, with tracking wide and AF-C set) and there are a few branches around the frame (perhaps using centre spot here), then it will have no problem detecting the target and focuses on it immediately.

Is my interpretation correct, are others experiencing this or am I expecting too much of the software to improve as it 'learns'?

As I'm asking this, you may have guessed that I have not detected any 'improvement' since my camera was new. The A7CR and Sony 100-400 produce some good shots, but I often have to wait for the AF to find the target or switch to DMF and focus manually. I appreciate it's not the fastest AF camera, but my 2000 birds were mostly sitting still!

I appreciate any advice or comments, Thanks.
 
What settings are you using for focus?

Where do you feel that the camera is letting you down?
 
I think "deep learning" and other techie words probably sounded good to a marketing executive and got used in advertising without being fact-checked by engineers. The autofocus algorithm is greatly improved over the a6000, but do I think my A6700 is improving said algorithm? Nah. But does it bother me? Also nah. It's still the best AF in the industry (maybe I'm a little biased).
 
This brings me to my first real question: Sony (and other manufacturers) use phrases like 'deep learning' for their latest AI camera software. I interpret this as meaning the software recognises your photos and improves the camera's response as you take more and more photos. For example, assuming you have bird detection on and you take 2000 photos of birds, the camera will learn from this and will recognise birds more easily in future.
That is how AI learns, but if anything like that is going to happen it will be through firmware updates. Unless I have been missing something, there is no point in the photo-taking process at which I identify what the subject in the photo is, and without that there is no way for the system to learn.
 
That is how AI learns, but if anything like that is going to happen it will be through firmware updates. Unless I have been missing something, there is no point in the photo-taking process at which I identify what the subject in the photo is, and without that there is no way for the system to learn.
This is what I was thinking. What/where is the user input required to tell the AI that it performed the task correctly? So I just consider it marketing gibberish and enjoy my camera.
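The point made here can be illustrated with a tiny conceptual sketch (this is not Sony's actual firmware; the weights, features, and threshold are made up). A shipped detector is just a fixed function of frozen weights, while a genuine learning step needs a ground-truth label that the camera never receives:

```python
# Conceptual sketch: why an in-camera detector can't improve from your
# shooting alone. All numbers here are invented for illustration.

def detect_bird(image_features, weights):
    # Inference: applies frozen weights shipped in firmware.
    score = sum(f * w for f, w in zip(image_features, weights))
    return score > 0.5  # "bird detected" if the score clears a threshold

def training_step(weights, image_features, label, lr=0.1):
    # Learning requires a LABEL: someone must confirm "this really was a
    # bird". The camera never gets that ground truth, so this step only
    # happens offline at the factory, with results delivered via firmware.
    score = sum(f * w for f, w in zip(image_features, weights))
    error = label - score
    return [w + lr * error * f for w, f in zip(weights, image_features)]

weights = [0.2, 0.5, 0.3]    # frozen firmware weights
features = [1.0, 0.8, 0.1]   # hypothetical features of one frame
print(detect_bird(features, weights))  # same answer every time: no learning
```

Taking 2000 bird photos only runs `detect_bird` 2000 times with the same weights; nothing ever calls `training_step`, because nothing in the shooting workflow supplies `label`.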
 
I know where you're coming from.

The a7cr is brilliant in virtually all respects. The two areas that let it down where wildlife is concerned are the viewfinder and the handling. Both are consequences of its USP: it's tiny and it has huge resolving power.

The 100-400 is a great lens and, in crop mode, makes a fair wildlife lens. Pop it on an a7cr and it becomes less impressive, as it has quite short reach and is quite slow.
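The reach trade-off is simple arithmetic. A rough sketch, assuming the usual ~1.5x APS-C crop factor and the A7CR's 61 MP full-frame sensor (the exact cropped pixel count on the real camera may differ slightly from this back-of-envelope figure):

```python
# Equivalent field of view and remaining resolution in APS-C crop mode.
crop = 1.5          # approximate APS-C crop factor
full_frame_mp = 61  # A7CR full-frame resolution

for focal in (100, 400):
    print(f"{focal} mm -> ~{focal * crop:.0f} mm equivalent field of view")

# Cropping trims both dimensions by the crop factor, so pixel count
# drops by crop**2.
print(f"Resolution in crop mode: ~{full_frame_mp / crop**2:.0f} MP")
```

So in crop mode the 100-400 frames like a ~150-600 mm while still leaving roughly 26-27 MP, which is why it works better as a wildlife combination there than at the full 61 MP full-frame view.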

The viewfinder for me is the killer, it really doesn't have the size or resolution to place the point of focus accurately.

Frame rate is also fairly low for birds in action.

That said, I love my a7cr, but rarely use it for wildlife. Mostly travel and street, where its diminutive size and massive sensor make it ideal. I can use a discreet lens and crop to my heart's content.
 
The move to the A7CR was for more megapixels and the fact that the A6000 was quite a few years old and the latest cameras offer so much more technology.
I just got back from a vacation where I really wished my A6700 gave me more megapixels to creatively crop in post-editing. It reminded me about your comment and I think you made a great choice.
 
Thanks guys for your input. It has encouraged me to do some deeper testing and come up with some results, so I've now got data from 100 photos of a Kingfisher taken over a 90-minute period a few days ago.

Shooting Environment:
I'm in a bird hide parallel to the river. The 'perch' is a branch set up (about 10 metres away) on my side of the river, at 90 degrees to the river so most of the time the Kingfisher is sideways on or facing away looking across the river. It's morning so the sun is in the east on the other side of the river, causing the side of the bird facing the camera to be in shade. Not ideal but you can't choose the weather (it's the first time I've seen the Kingfisher here this year). In the foreground there was some tall grass which was blowing left and right and interfering with the shot.

The Kingfisher is a small bird with virtually no tail so depth of field is not really an issue. Obviously very distinctive and not to be confused with the background. I wasn't too concerned about the best light settings, I wanted a reasonably fast shutter to limit any blur from movement.

Camera settings:
A7CR with SEL100400GM / manual dial / AF-C / centre fix / mechanical shutter / single shot / recognition target: Bird / subject recognition ON / RAW + JPEG (JPEG fine) / 1/1000 or 1/1250 s / f/5.6 or f/6.3 / ISO 400 (perhaps I should have gone significantly higher here to lift the shadows; next time)


I have put the results into 3 groups:

1. EYE-BOX: 36 frames where the camera produced a small eye-focus box. The box was sometimes on the back of the head rather than over the actual eye, but close enough to make no difference. All but one were in focus; the exception did not focus on the bird at all and produced a blurred image. Eye detection even worked when the bird's head was turned away and the camera could just pick up the eye as a slight curve on the edge of the head profile. I am very happy with these photos, although many were deleted as the back of the head is not really that interesting! Still, it shows the AF/AI doing its job, so this 'eye-box' score is 9/10.

See these (eye focus on both, almost):

[Image pair: Galaxy S24 reference shot + A7CR frame (FE 100-400mm F4.5-5.6 GM OSS, 400 mm, f/5.6, 1/1250 s, ISO 400) — bird in focus]

[Image pair: Galaxy S24 reference shot + A7CR frame (FE 100-400mm F4.5-5.6 GM OSS, 400 mm, f/5.6, 1/1250 s, ISO 400) — grass in focus]
2. CENTRE-BOX: 31 frames where the camera set a focus box approximately around the centre fix. This is the interesting part. The camera could not detect the eye, probably because of the shadow on this side of the bird, so it falls back to a bigger box around the centre. Whether the bird was completely or only partially in the box, the focus was almost always set by the grass somewhere within the box. This is where the AI completely fails: it should try to identify a bird, or part of a bird, within the box before settling on grass or branches. As 20 of these ignored the bird completely and focused on the grass, the 'centre-box' score is 2/10.

See this (grass in perfect focus!):

[Image pair: Galaxy S24 reference shot + A7CR frame (FE 100-400mm F4.5-5.6 GM OSS, 400 mm, f/5.6, 1/1250 s, ISO 400) — grass in focus]




3. DMF: 33 photos where I used DMF because I could see previous shots had not focused properly; it gives me a chance to fine-tune the focus. Obviously a human is slower than the AF, but I could wait until the grass was in a favourable position and the bird was looking the right way before taking the shot, so the throw-away rate was minimal. The fact that the Kingfisher spends a long time sitting in one place, just moving its head around, means DMF is a good option and the AI is not really necessary.
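The three groups above can be tallied as keeper rates. A quick sketch using only the counts reported in this post (one blurred eye-box frame, 20 grass-focused centre-box frames; the "minimal" DMF throw-away rate is treated as zero misses purely for illustration):

```python
# Tally of the 100-frame Kingfisher session described above.
results = {
    "eye-box":    {"total": 36, "missed": 1},   # 1 blurred frame
    "centre-box": {"total": 31, "missed": 20},  # 20 focused on grass
    "dmf":        {"total": 33, "missed": 0},   # assumption: minimal ~ 0
}

for mode, r in results.items():
    hits = r["total"] - r["missed"]
    rate = 100 * hits / r["total"]
    print(f"{mode:10s}: {hits}/{r['total']} in focus ({rate:.0f}%)")
```

The contrast is stark: roughly 97% usable when the eye box appears versus about 35% when the camera falls back to the centre box, which matches the 9/10 and 2/10 scores given above.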


Summary

I am a retired software engineer (IBM mainframes) with no AI experience other than playing with ChatGPT! I can see how AI can aid the photographer, but as some of the previous posts have said, there has to be some input to generate a change (learning). This could be via firmware updates, but for something to really learn in the field, the input should come from within the camera (for example, the software analyses the photo, sees the bird is out of focus, either in real time or directly after the event, and adjusts its future behaviour within the originally programmed parameters). Sony seem to have cracked the focus when the camera detects the bird's eye somewhere in the frame, but when it can't detect the eye or estimate a nearby spot (say next to the beak or in the middle of the head), it struggles: I'm not sure about the depth of the current 'learning' when it can't choose between a stem of grass and a bird-shaped object! There is a long way to go, but then again it could be fixed with a team of brilliant engineers!

Cheers, here are a couple of the better ones:

[Attached: two of the better shots — 1750694134635.jpeg, 1750694075417.jpeg]
 
