Over the last month, there has been a steady increase in soft 404s in my Google Webmaster Tools report. It started at 8 URLs and is now up to 529. All the pages exist, but they are all in a folder protected by aMember, which made me wonder what the issue could be. I have used this setup for years, and this only started on April 12th. Has anyone else encountered this? Any help or advice would be hugely appreciated.
Sounds like you have links to the pages, but Googlebot can't access them because they are aMember-protected. Adding the protected folders to the Disallow rules in your robots.txt should stop the bot from crawling them and reporting soft 404s. David
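For example, a minimal robots.txt along those lines might look like this (the folder name /members/ here is just a placeholder; use whatever path aMember actually protects on your site):

```
# Hypothetical example: block all crawlers from the protected folder.
# Replace /members/ with your actual aMember-protected directory.
User-agent: *
Disallow: /members/
```

Note that robots.txt only asks crawlers not to fetch those URLs; it doesn't remove already-reported ones from the report, so the count may take a while to drop.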