DETAILED ANALYSIS OF THE BALANCE OF PAYMENTS ON CAPITAL ACCOUNT
Keywords: capital account, pre-reform period, post-reform period

Abstract
Data analysis and the conclusions drawn from it go hand in hand. The ultimate goal of analysis is to build a mental model in which the necessary connections have been carefully established so that important and plausible inferences can be drawn. Analysis must therefore be carried out in a way that links the findings firmly to the study's stated objectives. Interpretation means considering the broader principles at work and the consequences that follow from them; with careful interpretation, both the implications of the study and its meaning become apparent. An accurate interpretation is impossible without thorough analysis, and a thorough analysis is incomplete without accurate interpretation, so each depends on the other; indeed, interpretation can be regarded as part of the analytical process. This study examines the growth trends of the capital account component of the balance of payments over the period covered by the study. One objective was to assess the performance of the capital account in the pre-reform and post-reform periods of the New Economic Policy. A regression model, compound growth rates, and the Chow test were used to examine the performance of the capital account.
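To make the stated methodology concrete, the sketch below illustrates, with entirely synthetic data and hypothetical variable names rather than the study's actual series, how a compound growth rate can be estimated from a log-linear trend regression, ln Y_t = a + b t with CGR (%) = (e^b − 1) × 100, and how a Chow test can compare the pre-reform and post-reform sub-periods; the 1991 break year is assumed here purely for illustration.

```python
# Minimal sketch (synthetic data, hypothetical names): compound growth rate from a
# log-linear trend regression and a Chow test for a structural break at an assumed
# reform year. This is not the study's actual data or code.
import numpy as np

def ols_rss(X, y):
    """Ordinary least squares; returns coefficients and residual sum of squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, float(resid @ resid)

# Hypothetical annual capital-account series (replace with the actual BoP data).
years = np.arange(1975, 2005)
rng = np.random.default_rng(0)
values = np.exp(0.05 * (years - years[0])) * (1 + 0.1 * rng.standard_normal(len(years)))

# Log-linear trend regression: ln(Y_t) = a + b*t.
t = (years - years[0]).astype(float)
X = np.column_stack([np.ones_like(t), t])
y = np.log(values)
beta_full, rss_pooled = ols_rss(X, y)

# Compound growth rate implied by the trend coefficient b.
cgr = (np.exp(beta_full[1]) - 1) * 100
print(f"Compound growth rate: {cgr:.2f}% per annum")

# Chow test: compare the pooled regression against separate pre- and
# post-break regressions (break year 1991 assumed for illustration).
break_year = 1991
pre = years < break_year
k = X.shape[1]
_, rss_pre = ols_rss(X[pre], y[pre])
_, rss_post = ols_rss(X[~pre], y[~pre])
n = len(y)
f_stat = ((rss_pooled - (rss_pre + rss_post)) / k) / ((rss_pre + rss_post) / (n - 2 * k))
print(f"Chow test F-statistic: {f_stat:.2f} (df = {k}, {n - 2 * k})")
```

A significant F-statistic would indicate that the pre-reform and post-reform trend regressions differ, which is the kind of structural-break evidence the Chow test is typically used to provide in pre-/post-reform comparisons.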