Check out this example code for "robots.txt allow folder, disallow subfolder". It will help you understand the concept better.

Code Example 1

User-agent: *
Allow: /news/$
Disallow: /news/
Explanation:

Google's robots.txt spec 
(https://developers.google.com/search/reference/robots_txt), 
which is more up to date than the "official" spec, states that:

/fish/ will match anything in the /fish/ folder but will not match 
/fish (no wildcard necessary, since "The trailing slash means 
this matches anything in this folder."). Reverse engineering that gives:

User-agent: * (or whichever user agent you want to address)
Allow: /news/$ (allows /news/ itself; the $ anchors the rule to the 
end of the URL, so it matches nothing beyond /news/)
Disallow: /news/ (disallows anything inside the /news/ folder)
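The reason this works is Google's precedence rule: the most specific (longest) matching pattern wins, and on a tie the less restrictive (Allow) rule wins. A minimal Python sketch of that logic is below. Note this is an illustration of the matching rules, not a full robots.txt parser, and the function names (rule_matches, is_allowed) are made up for this example; also be aware that Python's built-in urllib.robotparser does not implement the * and $ wildcards, which is why we roll our own matcher here.

```python
import re

def rule_matches(pattern, path):
    # Translate a robots.txt pattern to a regex:
    # '*' matches any run of characters, a trailing '$' anchors
    # the pattern to the end of the URL path.
    regex = re.escape(pattern).replace(r'\*', '.*')
    if regex.endswith(r'\$'):
        regex = regex[:-2] + '$'
    return re.match(regex, path) is not None

def is_allowed(path, rules):
    # rules: list of ('allow' | 'disallow', pattern) tuples.
    # Longest matching pattern wins; Allow wins ties (Google's spec).
    best = None  # (pattern_length, allowed)
    for directive, pattern in rules:
        if rule_matches(pattern, path):
            length = len(pattern)
            allowed = (directive == 'allow')
            if (best is None or length > best[0]
                    or (length == best[0] and allowed)):
                best = (length, allowed)
    # No matching rule means the path is allowed by default.
    return True if best is None else best[1]

# The rules from Code Example 1:
rules = [('allow', '/news/$'), ('disallow', '/news/')]
print(is_allowed('/news/', rules))              # True: /news/$ (len 7) beats /news/ (len 6)
print(is_allowed('/news/article.html', rules))  # False: only Disallow: /news/ matches
```

Running it shows /news/ itself is allowed (the longer Allow: /news/$ pattern wins) while anything deeper in the folder is disallowed.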

Test it in Google Search Console, or in Yandex 
(https://webmaster.yandex.com/tools/robotstxt/) 
to ensure it works for your site.
