+1: if you cannot do security, you have no business making dating apps. The kind of data those apps collect can ruin lives overnight. This is not theoretical; here is a recent example: https://www.bbc.com/news/articles/c74nlgyv7r4o
When I was a student, I led a project where we built timeclock web software.
I enforced a no-login policy because I didn't want potential users to even think about entering a password into a form on the website. I didn't trust myself or my group to handle it correctly, so I decided it was best to just sidestep the problem. Naturally this made the application a lot less useful - but it was a student project, so who cares.
Software engineering students have ethical obligations just like all other engineers. We need to think these things through and decide whether we even want to implement certain features. And we need to be thinking in terms of risk, not design.
Storing sensitive data is risky, even if you're really talented. Companies try to put processes in place to mitigate that risk. But students are almost certainly not doing that, so they should question whether they should even be doing what they're doing in the first place.
I would agree with you. Dating app data might not be legally protected like some PII out there, but there are easily foreseeable bad consequences from compromised dating app data of any kind. Security should be accounted for from the very beginning.
If you cannot do security, you have no business making any app people use in significant numbers containing Personally Identifiable Information (PII).
Perhaps, as with GDPR, HIPAA, and similar regimes, any web or platform apps that hold login details and/or PII should be required to distance themselves thoroughly from haphazard, organic, unprofessional, and amateurish processes and technologies, and to conform to trusted, proven patterns, processes, and technologies that are tested, audited, and preferably formally verified for correctness. Without formalization and professional standards, there are no standards, and these preventable, reinvent-the-wheel-badly shops will keep doing the same thing and expecting a different result™. Massive hacks, circumvention, scary bugs, and other attacks will continue. And I think this calls for a proper amount of accreditation, routine auditing, and (the scary word, but applied smartly) regulation to drag the industry - kicking and screaming if need be, with appropriate leadership on the government/NGO-SGE side - from an under-structured wild west™ into professionalism.
The claim that it should have come up in a government vetting process seems like proof that one would be better off publishing one's own dating information oneself than entrusting it to a site that might lose it - or worse, might hand it to a government deliberately.
Your statement is similar to: if you cannot cook an egg in a normal pan without it sticking, you should not serve food in a kitchen.
These are merely unconstructive statements. Developers have free will; they spent time and money to make the app, and customers spent time and money to use it. If there are any mistakes, then until you prove they were intended to harm the customer - or that they violated the data-safety contract between the app and the customer - the developers are free to keep running their business. The free market will decide what happens next.
And the link you gave as an example makes no sense. The person was fired from a government security position because he was not honest from the start: he did not disclose that he used a dating app. With his private data sitting in a dating app, even if it were never leaked, the data could be traded illegally behind the scenes, enabling social engineering and harming the government and nation he works for. Actually, that agency and that nation were lucky his data was leaked - deliberately, by someone. It was his fault.